fire many transactions at an Ethereum node, then produce diagrams of TPS, blocktime, gasUsed and gasLimit, and blocksize.

drandreaskrueger, updated 🕥 2023-02-15 20:08:10

NEWS 2019-Feb-23 - video! Explaining software v55 video released on youtube: --> watch

chainhammer-logo.png


chainhammer v59

TPS measurements of parity aura, geth clique, quorum, tobalaba, etc. It should work with any Ethereum-type chain; the focus was on PoA consensus.

instructions

video

The brand new release v55 is now presented & explained in a useful video on youtube.

folders

  • hammer/ - submits many transactions, while watching the recent blocks
  • reader/ - reads blocks; visualizes TPS, blocktime, gas, bytes - see reader/README.md
  • docs/ - see esp. reproduce.md, cloud.md, FAQ.md, new: azure.md
  • results/ - for each client one markdown file; results/runs/ - auto-generated pages
  • logs/ - check this first if problems
  • networks/ - network starters & external repos via install script, see below
  • scripts/ - installers and other useful bash scripts
  • env/ - Python virtualenv, created via install script, see below
  • tests/ - start whole integration test suite via ./pytest.sh
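As a rough illustration of the kind of computation reader/ performs, here is a minimal, hypothetical sketch that derives per-block blocktime and TPS from block data. The dict layout (number, timestamp, txcount) is an assumption made for this example, not chainhammer's actual data structure; a real reader would fetch blocks over RPC.

```python
# Minimal sketch: per-block blocktime and TPS from block data.
# The block dicts below are a hypothetical stand-in for what a
# chain reader would fetch from a node (e.g. block number,
# block timestamp, and transaction count per block).

def analyse(blocks):
    """Return a list of (blocknumber, blocktime_seconds, tps) tuples."""
    out = []
    for prev, cur in zip(blocks, blocks[1:]):
        blocktime = cur["timestamp"] - prev["timestamp"]
        tps = cur["txcount"] / blocktime if blocktime else float("inf")
        out.append((cur["number"], blocktime, tps))
    return out

blocks = [
    {"number": 10, "timestamp": 1000, "txcount": 0},
    {"number": 11, "timestamp": 1005, "txcount": 200},
    {"number": 12, "timestamp": 1010, "txcount": 150},
]

for number, blocktime, tps in analyse(blocks):
    print(number, blocktime, tps)
```

Plotting these tuples over block numbers gives exactly the kind of TPS/blocktime diagrams shown in reader/README.md.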

chronology

See the results/ folder:

  1. log.md: initial steps; also tried Quorum's private transactions
  2. quorum.md: raft consensus, quorum is a geth fork
  3. tobalaba.md: parity fork of EnergyWebFoundation
  4. quorum-IBFT.md: IstanbulBFT, 2nd consensus algo in quorum
  5. geth.md: geth clique PoA algorithm
  6. parity.md: parity aura PoA algorithm, many attempts to accelerate
  7. eos.md: not begun yet
  8. substrate.md: not begun yet

results summary

This table is outdated: each experiment was run manually in autumn 2018. It will soon be completely re-done using the automation below. Please contact me if you know how to accelerate any of these clients:

| hardware | node type | #nodes | config | peak TPS_av | final TPS_av |
|---------------|--------------------|--------|-----------|-------------|--------------|
| t2.micro | parity aura | 4 | (D) | 45.5 | 44.3 |
| t2.large | parity aura | 4 | (D) | 53.5 | 52.9 |
| t2.xlarge | parity aura | 4 | (J) | 57.1 | 56.4 |
| t2.2xlarge | parity aura | 4 | (D) | 57.6 | 57.6 |
| | | | | | |
| t2.micro | parity instantseal | 1 | (G) | 42.3 | 42.3 |
| t2.xlarge | parity instantseal | 1 | (J) | 48.1 | 48.1 |
| | | | | | |
| t2.2xlarge | geth clique | 3+1 +2 | (B) | 421.6 | 400.0 |
| t2.xlarge | geth clique | 3+1 +2 | (B) | 386.1 | 321.5 |
| t2.xlarge | geth clique | 3+1 | (K) | 372.6 | 325.3 |
| t2.large | geth clique | 3+1 +2 | (B) | 170.7 | 169.4 |
| t2.small | geth clique | 3+1 +2 | (B) | 96.8 | 96.5 |
| t2.micro | geth clique | 3+1 | (H) | 124.3 | 122.4 |
| | | | | | |
| t2.micro SWAP | quorum crux IBFT | 4 | (I) SWAP! | 98.1 | 98.1 |
| | | | | | |
| t2.micro | quorum crux IBFT | 4 | (F) | lack of RAM | |
| t2.large | quorum crux IBFT | 4 | (F) | 207.7 | 199.9 |
| t2.xlarge | quorum crux IBFT | 4 | (F) | 439.5 | 395.7 |
| t2.xlarge | quorum crux IBFT | 4 | (L) | 389.1 | 338.9 |
| t2.2xlarge | quorum crux IBFT | 4 | (F) | 435.4 | 423.1 |
| c5.4xlarge | quorum crux IBFT | 4 | (F) | 536.4 | 524.3 |

These results are easy to reproduce; the config column is also explained there. The quickest route is my ready-made Amazon AMI image. See the bottom of parity.md, geth.md, and quorum-IBFT.md for the latest runs, issues, and additional details.

faster wider more

  • to see how I initially made this faster on Quorum, step by step, please do read the 1st logbook log.md
  • then I improved per client, see each in #chronology above
  • (possible TODOs - any other ideas?)

But not much more is needed: the current version is already fully automated. Use it! May it help you improve the speed of your Ethereum client!

you

Add yourself to other-projects.md if you use chainhammer, or if your project is similar to this one.

(Especially if you work in one of the dev teams - you know your client code best.) Please try to improve the above results, e.g. by varying the CLI arguments with which the nodes are started; I don't see that as my job, and you will be much more successful at it.

See parity PE#9393, parity SE#58521, geth GE#17447, quorum Q#479.

Please report back when you have done other / new measurements.

install and run

All this is developed and extensively tested on Debian, both locally and in the AWS cloud. New: Ubuntu is now also supported, see below.

quickstart

N.B.: Better do this on a disposable cloud or VirtualBox machine, because the installation makes lasting changes and needs sudo!

After unpacking a ZIP of the downloaded repo, or via:

```
git clone https://github.com/drandreaskrueger/chainhammer drandreaskrueger_chainhammer
ln -s drandreaskrueger_chainhammer CH
cd CH
```

you now only need these two lines to prepare and run the 1st experiment:

```
scripts/install.sh
CH_TXS=1000 CH_THREADING="sequential" ./run.sh $HOSTNAME-TestRPC testrpc
```

You will then have a diagram, and an HTML and MD page about this run!

(on Ubuntu instead: `scripts/install.sh docker ubuntu`)
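The CH_TXS / CH_THREADING settings above hint at the core design choice in hammer/: submitting transactions sequentially versus from many parallel workers. A stdlib-only sketch of that difference follows; the submit_tx function here is a dummy placeholder standing in for a real transaction submission, not chainhammer's actual code:

```python
# Sketch: sequential vs multi-threaded transaction submission.
# submit_tx is a dummy placeholder for a real eth_sendTransaction
# call against a node's RPC endpoint.
import concurrent.futures
import threading

counter_lock = threading.Lock()
submitted = []

def submit_tx(i):
    # A real hammer would POST a signed/unsigned transaction here.
    with counter_lock:
        submitted.append(i)

def run_sequential(n):
    # one transaction after another (CH_THREADING="sequential")
    for i in range(n):
        submit_tx(i)

def run_threaded(n, workers=23):
    # parallel submission; chainhammer's threaded runs used e.g. 23 workers
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as ex:
        list(ex.map(submit_tx, range(n)))

run_threaded(1000)
print(len(submitted))  # 1000
```

With a real node, the threaded variant keeps the transaction queue full and usually yields the higher TPS numbers seen in the results table.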

activate docker

Now better log out & log in again, or close the terminal and open a new one, because the above scripts/install.sh might have enabled docker for the first time for this user. Then:

All supported clients in one go:

For the full integration test, run each client for a short moment:

```
export CH_MACHINE=yourChoice
./run-all_small.sh
```

For detailed instructions, please see docs/, esp. reproduce.md, and for troubleshooting FAQ.md and github issues.

benchmarking a remote node

Chainhammer can now be stripped down to its pure benchmarking abilities, i.e. without the installation of docker and without the three local network starters (parity-deploy, geth-dev, quorum-crux). It was successfully used to benchmark the Microsoft Azure blockchain-as-a-service product. The essential difference is to start the installation with the switch nodocker:

scripts/install.sh nodocker

So, if you just want to benchmark your existing Ethereum node or network, have a look at the manual docs/azure.md .
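The nodocker mode means chainhammer only needs a reachable RPC endpoint. Under the hood, any such benchmark boils down to JSON-RPC calls against the node's HTTP port. A minimal stdlib-only sketch follows; the URL and helper names are illustrative assumptions, not chainhammer's actual code:

```python
# Sketch: talking JSON-RPC to an existing (remote) Ethereum node.
import json
import urllib.request

def rpc_payload(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body, as understood by any Ethereum node."""
    return {"jsonrpc": "2.0", "method": method,
            "params": params or [], "id": req_id}

def call(url, method, params=None):
    # Network call: only works against a live node at `url`.
    data = json.dumps(rpc_payload(method, params)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]

# e.g. against an existing remote node (the URL is a placeholder):
# block_number = int(call("http://10.0.0.5:8545", "eth_blockNumber"), 16)

print(rpc_payload("eth_blockNumber"))
```

Everything else (submitting the load, reading the blocks back) is layered on top of calls of exactly this shape, which is why no local docker networks are required.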

unittests

./pytest.sh activates the virtualenv, then starts a testrpc-py Ethereum simulator on http://localhost:8545 in the background, logging into tests/logs/; then runs ./deploy.py and tests; and finally runs all the unittests, also logging into tests/logs/.

(Instead of testrpc-py) if you want to run the tests against another node, just start that node; then run pytest manually:

```
source env/bin/activate
py.test -v --cov
```

There were 98 tests on January 23rd, all 98 PASSED (see this logfile via cat tests/logs/*.ansi, because it contains colors) on these different Ethereum providers:

  • testrpc instantseal (testrpc-py) 13 seconds
  • geth Clique (geth-dev) 63 seconds
  • quorum IBFT (blk-io/crux) 59 seconds
  • parity instantseal (parity-deploy) 8 seconds
  • parity aura (parity-deploy) 72 seconds

credits

Please credit this as:

benchmarking scripts "chainhammer"
maintainer: Dr Andreas Krueger 2018-2020
https://github.com/drandreaskrueger/chainhammer

Consider submitting your improvements & usage as a pull request. Thanks.

development was supported by

v01-v35 financed by Electron.org.uk 2018
v40-v55 financed by Web3Foundation 2018-2019
v58-v59 financed by Microsoft Azure 2019

logo

Thank you very much!

short summary

The open source tool 'chainhammer' submits a high load of smart contract transactions to an Ethereum-based blockchain; then 'chainreader' reads the whole chain and produces diagrams of TPS, blocktime, gasUsed and gasLimit, and blocksize. https://github.com/drandreaskrueger/chainhammer




The following diagrams are outdated! Just make your own, new ones, with:

```
CH_MACHINE=yourChoice ./run-all_large.sh
```

chainhammer: hammer --> reader --> diagrams

examples:

geth clique on AWS t2.xlarge

geth.md = geth (go ethereum client), "Clique" consensus.

50,000 transactions to an Amazon t2.xlarge machine.

Interesting artifact: after ~14k transactions, the speed drops considerably - but then recovers. Reported.

geth-clique-50kTx_t2xlarge_tps-bt-bs-gas_blks12-98.png

quorum IBFT on AWS t2.xlarge

quorum-IBFT.md = Quorum (geth fork), IBFT consensus, 20 million gasLimit, 1 second istanbul.blockperiod; 20,000 transactions multi-threaded with 23 workers. Initial average >400 TPS, then drops to below 300 TPS (see quorum issue).

quorum-crux-IBFT_t2xlarge_tps-bt-bs-gas_blks320-395.png

quorum raft

OLD RUN on a desktop machine.

quorum.md = Quorum (geth fork), raft consensus; 1000 transactions multi-threaded with 23 workers; average around 160 TPS, and 20 raft blocks per second.

reader/img/quorum_tps-bt-bs-gas_blks242-357.png

tobalaba

OLD RUN on a desktop machine.

tobalaba.md = Public "Tobalaba" chain of the EnergyWebFoundation (parity fork), PoA; 20k transactions; > 150 TPS if the client is well-connected.

reader/img/tobalaba_tps-bt-bs-gas_blks5173630-5173671.png

parity aura v1.11.11 on AWS t2.xlarge

parity.md#run-18 = using a parity-deploy.sh dockerized network of 4 local nodes with increased gasLimit and 5 seconds blocktime; 20k transactions; ~60 TPS on an Amazon t2.xlarge machine.

N.B.: Could not work with parity v2 yet because of bugs PD#76 and PE#9582 --> everything is still on parity v1.11.11

parity-v1.11.11-aura_t2xlarge_tps-bt-bs-gas_blks5-85.png

Calling all parity experts: how can these slow TPS results be improved?

See issue PE#9393, and the detailed log of what I've tried already, and the 2 shortest routes to reproducing the results: reproduce.md.

Thanks.

Issues

Getting several errors running scripts/install-virtualenv.sh

opened on 2023-03-10 00:41:38 by ashishchandr70

I ran install.sh nodocker but that failed because pandas could not be built.

Then I followed the instructions in https://github.com/drandreaskrueger/chainhammer/issues/21#issuecomment-707997775 but still getting errors running scripts/install-virtualenv.sh

My environment:

```
$ cat /etc/*release*
ID="ec2"
VERSION="20230124-1270"
PRETTY_NAME="Debian GNU/Linux 11 (bullseye)"
NAME="Debian GNU/Linux"
VERSION_ID="11"
VERSION="11 (bullseye)"
VERSION_CODENAME=bullseye
ID=debian
HOME_URL="https://www.debian.org/"
SUPPORT_URL="https://www.debian.org/support"
BUG_REPORT_URL="https://bugs.debian.org/"
```

Output of uname -a:

```
$ uname -a
Linux ip-172-31-44-113 5.10.0-21-cloud-amd64 #1 SMP Debian 5.10.162-1 (2023-01-21) x86_64 GNU/Linux
```

Error trace (some snipping done by me to fit the github character limit):

```
$ scripts/install-virtualenv.sh

create chainhammer virtualenv

after possibly removing a whole existing env/ folder !!!

the new virtualenv will be installed below here: /home/admin/chainhammer

Think twice. Then press enter to continue

++ rm -rf env ++ python3 -m venv env ++ echo

++ set +x +++ source env/bin/activate ++ echo

++ python3 -m pip install --upgrade pip==18.1 Collecting pip==18.1 Using cached pip-18.1-py2.py3-none-any.whl (1.3 MB) Installing collected packages: pip Attempting uninstall: pip Found existing installation: pip 20.3.4 Uninstalling pip-20.3.4: Successfully uninstalled pip-20.3.4 Successfully installed pip-18.1 ++ pip3 install wheel Collecting wheel Using cached https://files.pythonhosted.org/packages/bd/7c/d38a0b30ce22fc26ed7dbc087c6d00851fb3395e9d0dac40bec1f905030c/wheel-0.38.4-py3-none-any.whl Installing collected packages: wheel Successfully installed wheel-0.38.4 You are using pip version 18.1, however version 23.0.1 is available. You should consider upgrading via the 'pip install --upgrade pip' command. ++ pip3 install --upgrade py-solc==3.2.0 web3==4.8.2 'web3[tester]==4.8.2' rlp==0.6.0 eth-testrpc==1.3.5 requests==2.21.0 pandas==1.1.2 matplotlib==3.3.2 pytest==4.0.2 pytest-cov==2.6.0 jupyter==1.0.0 ipykernel==5.1.0 Collecting py-solc==3.2.0 Using cached https://files.pythonhosted.org/packages/47/74/d36abca3f36ccdcd04976c50f83502c870623e5beb4a4ec96c7bad4bb9e8/py_solc-3.2.0-py3-none-any.whl Collecting web3==4.8.2 Using cached https://files.pythonhosted.org/packages/84/7b/8dfe018c0b94a68f88d98ff39c11471ac55ffbcb22cd7ab41010c1476a75/web3-4.8.2-py3-none-any.whl Collecting rlp==0.6.0 Collecting eth-testrpc==1.3.5 Using cached https://files.pythonhosted.org/packages/bc/9a/8a8c90b8ed4db0afc39bc7b67b52aa8cbbc9c08bbd93f7ca92719e3493a3/eth_testrpc-1.3.5-py3-none-any.whl Collecting requests==2.21.0 Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl Collecting pandas==1.1.2 Using cached https://files.pythonhosted.org/packages/64/f1/8fdbd74edfc31625d597717be8c155c6226fc72a7c954c52583ab81a8614/pandas-1.1.2.tar.gz Installing build dependencies ... 
done Collecting matplotlib==3.3.2 Collecting pytest==4.0.2 Using cached https://files.pythonhosted.org/packages/19/80/1ac71d332302a89e8637456062186bf397abc5a5b663c1919b73f4d68b1b/pytest-4.0.2-py2.py3-none-any.whl Collecting pytest-cov==2.6.0 Using cached https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl Building wheels for collected packages: pandas, pillow Running setup.py bdist_wheel for pandas ... error Complete output from command /home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;file='/tmp/pip-install-t6avqa7g/pandas/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" bdist_wheel -d /tmp/pip-wheel-2ez98py3 --python-tag cp39: /tmp/pip-install-t6avqa7g/pandas/setup.py:45: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead. _CYTHON_INSTALLED = _CYTHON_VERSION >= LooseVersion(min_cython_ver) /tmp/pip-install-t6avqa7g/pandas/setup.py:491: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead. if np.version < LooseVersion("1.16.0"): /tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/setuptools/init.py:85: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated. Requirements should be satisfied by a PEP 517 installer. If you are using pip, you can try pip install --use-pep517. dist.fetch_build_eggs(dist.setup_requires) running bdist_wheel running build running build_py creating build creating build/lib.linux-x86_64-cpython-39 creating build/lib.linux-x86_64-cpython-39/pandas [...]

UPDATING build/lib.linux-x86_64-cpython-39/pandas/_version.py set build/lib.linux-x86_64-cpython-39/pandas/_version.py to '1.1.2' running build_ext building 'pandas._libs.algos' extension creating build/temp.linux-x86_64-cpython-39 creating build/temp.linux-x86_64-cpython-39/pandas creating build/temp.linux-x86_64-cpython-39/pandas/_libs [...] building 'pandas._libs.writers' extension x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/writers.c -o build/temp.linux-x86_64-cpython-39/pandas/_libs/writers.o -Werror pandas/_libs/writers.c: In function ‘__pyx_f_6pandas_5_libs_7writers_word_len’: pandas/_libs/writers.c:5099:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations] 5099 | __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val); | ^~~~~~~~~ In file included from /usr/include/python3.9/unicodeobject.h:1026, from /usr/include/python3.9/Python.h:97, from pandas/_libs/writers.c:35: /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here 446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject op) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~ pandas/_libs/writers.c:5099:5: error: ‘PyUnicode_AsUnicode’ is deprecated [-Werror=deprecated-declarations] 5099 | __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val); | ^~~~~~~~~ In file included from /usr/include/python3.9/unicodeobject.h:1026, from /usr/include/python3.9/Python.h:97, from pandas/_libs/writers.c:35: /usr/include/python3.9/cpython/unicodeobject.h:580:45: note: declared here 580 | Py_DEPRECATED(3.3) PyAPI_FUNC(Py_UNICODE ) PyUnicode_AsUnicode( | ^~~~~~~~~~~~~~~~~~~ pandas/_libs/writers.c:5099:5: error: 
‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations] 5099 | __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val); | ^~~~~~~~~ In file included from /usr/include/python3.9/unicodeobject.h:1026, from /usr/include/python3.9/Python.h:97, from pandas/_libs/writers.c:35: /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here 446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject *op) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~ cc1: all warnings being treated as errors error: command '/usr/bin/x86_64-linux-gnu-gcc' failed with exit code 1


Failed building wheel for pandas Running setup.py clean for pandas Running setup.py bdist_wheel for pillow ... error Complete output from command /home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;file='/tmp/pip-install-t6avqa7g/pillow/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" bdist_wheel -d /tmp/pip-wheel-svlivdp3 --python-tag cp39: running bdist_wheel running build running build_py Generating grammar tables from /usr/lib/python3.9/lib2to3/Grammar.txt Generating grammar tables from /usr/lib/python3.9/lib2to3/PatternGrammar.txt creating build creating build/lib.linux-x86_64-3.9 creating build/lib.linux-x86_64-3.9/PIL [...] no previously-included directories found matching '.ci' writing manifest file 'src/Pillow.egg-info/SOURCES.txt' running build_ext

The headers or library files could not be found for jpeg, a required dependency when compiling Pillow from source.

Please see the install instructions at: https://pillow.readthedocs.io/en/latest/installation.html

Traceback (most recent call last): File "/tmp/pip-install-t6avqa7g/pillow/setup.py", line 993, in setup( File "/home/admin/chainhammer/env/lib/python3.9/site-packages/setuptools/init.py", line 162, in setup return distutils.core.setup(**attrs) File "/usr/lib/python3.9/distutils/core.py", line 148, in setup dist.run_commands() File "/usr/lib/python3.9/distutils/dist.py", line 966, in run_commands self.run_command(cmd) File "/usr/lib/python3.9/distutils/dist.py", line 985, in run_command cmd_obj.run() File "/home/admin/chainhammer/env/lib/python3.9/site-packages/wheel/bdist_wheel.py", line 325, in run self.run_command("build") File "/usr/lib/python3.9/distutils/cmd.py", line 313, in run_command self.distribution.run_command(command) File "/usr/lib/python3.9/distutils/dist.py", line 985, in run_command cmd_obj.run() File "/usr/lib/python3.9/distutils/command/build.py", line 135, in run self.run_command(cmd_name) File "/usr/lib/python3.9/distutils/cmd.py", line 313, in run_command self.distribution.run_command(command) File "/usr/lib/python3.9/distutils/dist.py", line 985, in run_command cmd_obj.run() File "/home/admin/chainhammer/env/lib/python3.9/site-packages/setuptools/command/build_ext.py", line 84, in run _build_ext.run(self) File "/usr/lib/python3.9/distutils/command/build_ext.py", line 340, in run self.build_extensions() File "/tmp/pip-install-t6avqa7g/pillow/setup.py", line 808, in build_extensions raise RequiredDependencyException(f) main.RequiredDependencyException: jpeg

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "", line 1, in File "/tmp/pip-install-t6avqa7g/pillow/setup.py", line 1010, in raise RequiredDependencyException(msg) main.RequiredDependencyException:

The headers or library files could not be found for jpeg, a required dependency when compiling Pillow from source.

Please see the install instructions at: https://pillow.readthedocs.io/en/latest/installation.html


Failed building wheel for pillow Running setup.py clean for pillow Failed to build pandas pillow eth-utils 1.10.0 has requirement eth-hash<0.4.0,>=0.3.1, but you'll have eth-hash 0.5.1 which is incompatible. eth-rlp 0.3.0 has requirement eth-utils<3,>=2.0.0, but you'll have eth-utils 1.10.0 which is incompatible. jupyter-console 6.6.3 has requirement ipykernel>=6.14, but you'll have ipykernel 5.1.0 which is incompatible. Installing collected packages: semantic-version, py-solc, eth-typing, pycryptodome, eth-hash, toolz, cytoolz, eth-utils, six, parsimonious, eth-abi, lru-dict, eth-keys, eth-keyfile, hexbytes, attrdict, rlp, eth-rlp, eth-account, websockets, idna, urllib3, chardet, certifi, requests, web3, MarkupSafe, Werkzeug, click, PyYAML, repoze.lru, pbkdf2, scrypt, bitcoin, pyethash, pysha3, pycparser, cffi, secp256k1, ethereum, json-rpc, eth-testrpc, python-dateutil, pytz, numpy, pandas, kiwisolver, pillow, pyparsing, cycler, matplotlib, py, atomicwrites, attrs, more-itertools, pluggy, pytest, coverage, pytest-cov, pickleshare, backcall, ptyprocess, pexpect, asttokens, pure-eval, executing, stack-data, decorator, pygments, parso, jedi, traitlets, wcwidth, prompt-toolkit, matplotlib-inline, ipython, zipp, importlib-metadata, tornado, pyzmq, platformdirs, jupyter-core, jupyter-client, ipykernel, jupyterlab-widgets, widgetsnbextension, ipywidgets, terminado, argon2-cffi-bindings, argon2-cffi, pyrsistent, jsonschema, fastjsonschema, nbformat, Send2Trash, jinja2, defusedxml, nbclient, soupsieve, beautifulsoup4, mistune, pandocfilters, jupyterlab-pygments, webencodings, bleach, packaging, tinycss2, nbconvert, nest-asyncio, sniffio, anyio, websocket-client, jupyter-server-terminals, prometheus-client, rfc3986-validator, python-json-logger, rfc3339-validator, jupyter-events, jupyter-server, notebook-shim, ipython-genutils, nbclassic, notebook, jupyter-console, qtpy, qtconsole, jupyter Running setup.py install for pandas ... 
error Complete output from command /home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;file='/tmp/pip-install-t6avqa7g/pandas/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-record-aq3utvkw/install-record.txt --single-version-externally-managed --compile --install-headers /home/admin/chainhammer/env/include/site/python3.9/pandas: /tmp/pip-install-t6avqa7g/pandas/setup.py:45: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead. _CYTHON_INSTALLED = _CYTHON_VERSION >= LooseVersion(min_cython_ver) /tmp/pip-install-t6avqa7g/pandas/setup.py:491: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead. if np.version < LooseVersion("1.16.0"): Compiling pandas/_libs/algos.pyx because it changed. Compiling pandas/_libs/groupby.pyx because it changed. Compiling pandas/_libs/hashing.pyx because it changed. Compiling pandas/_libs/hashtable.pyx because it changed. Compiling pandas/_libs/index.pyx because it changed. Compiling pandas/_libs/indexing.pyx because it changed. Compiling pandas/_libs/internals.pyx because it changed. Compiling pandas/_libs/interval.pyx because it changed. Compiling pandas/_libs/join.pyx because it changed. Compiling pandas/_libs/lib.pyx because it changed. Compiling pandas/_libs/missing.pyx because it changed. Compiling pandas/_libs/parsers.pyx because it changed. Compiling pandas/_libs/reduction.pyx because it changed. Compiling pandas/_libs/ops.pyx because it changed. Compiling pandas/_libs/ops_dispatch.pyx because it changed. Compiling pandas/_libs/properties.pyx because it changed. Compiling pandas/_libs/reshape.pyx because it changed. Compiling pandas/_libs/sparse.pyx because it changed. Compiling pandas/_libs/tslib.pyx because it changed. Compiling pandas/_libs/tslibs/base.pyx because it changed. 
Compiling pandas/_libs/tslibs/ccalendar.pyx because it changed. Compiling pandas/_libs/tslibs/dtypes.pyx because it changed. Compiling pandas/_libs/tslibs/conversion.pyx because it changed. Compiling pandas/_libs/tslibs/fields.pyx because it changed. Compiling pandas/_libs/tslibs/nattype.pyx because it changed. Compiling pandas/_libs/tslibs/np_datetime.pyx because it changed. Compiling pandas/_libs/tslibs/offsets.pyx because it changed. Compiling pandas/_libs/tslibs/parsing.pyx because it changed. Compiling pandas/_libs/tslibs/period.pyx because it changed. Compiling pandas/_libs/tslibs/strptime.pyx because it changed. Compiling pandas/_libs/tslibs/timedeltas.pyx because it changed. Compiling pandas/_libs/tslibs/timestamps.pyx because it changed. Compiling pandas/_libs/tslibs/timezones.pyx because it changed. Compiling pandas/_libs/tslibs/tzconversion.pyx because it changed. Compiling pandas/_libs/tslibs/vectorized.pyx because it changed. Compiling pandas/_libs/testing.pyx because it changed. Compiling pandas/_libs/window/aggregations.pyx because it changed. Compiling pandas/_libs/window/indexers.pyx because it changed. Compiling pandas/_libs/writers.pyx because it changed. Compiling pandas/io/sas/sas.pyx because it changed. 
[ 1/40] Cythonizing pandas/_libs/algos.pyx [ 2/40] Cythonizing pandas/_libs/groupby.pyx warning: pandas/_libs/groupby.pyx:1134:26: Unreachable code [ 3/40] Cythonizing pandas/_libs/hashing.pyx [ 4/40] Cythonizing pandas/_libs/hashtable.pyx [ 5/40] Cythonizing pandas/_libs/index.pyx [ 6/40] Cythonizing pandas/_libs/indexing.pyx [ 7/40] Cythonizing pandas/_libs/internals.pyx [ 8/40] Cythonizing pandas/_libs/interval.pyx [ 9/40] Cythonizing pandas/_libs/join.pyx [10/40] Cythonizing pandas/_libs/lib.pyx [11/40] Cythonizing pandas/_libs/missing.pyx [12/40] Cythonizing pandas/_libs/ops.pyx [13/40] Cythonizing pandas/_libs/ops_dispatch.pyx [14/40] Cythonizing pandas/_libs/parsers.pyx [15/40] Cythonizing pandas/_libs/properties.pyx [16/40] Cythonizing pandas/_libs/reduction.pyx [17/40] Cythonizing pandas/_libs/reshape.pyx [18/40] Cythonizing pandas/_libs/sparse.pyx [19/40] Cythonizing pandas/_libs/testing.pyx [20/40] Cythonizing pandas/_libs/tslib.pyx [21/40] Cythonizing pandas/_libs/tslibs/base.pyx [22/40] Cythonizing pandas/_libs/tslibs/ccalendar.pyx [23/40] Cythonizing pandas/_libs/tslibs/conversion.pyx [24/40] Cythonizing pandas/_libs/tslibs/dtypes.pyx [25/40] Cythonizing pandas/_libs/tslibs/fields.pyx [26/40] Cythonizing pandas/_libs/tslibs/nattype.pyx [27/40] Cythonizing pandas/_libs/tslibs/np_datetime.pyx [28/40] Cythonizing pandas/_libs/tslibs/offsets.pyx [29/40] Cythonizing pandas/_libs/tslibs/parsing.pyx [30/40] Cythonizing pandas/_libs/tslibs/period.pyx [31/40] Cythonizing pandas/_libs/tslibs/strptime.pyx [32/40] Cythonizing pandas/_libs/tslibs/timedeltas.pyx [33/40] Cythonizing pandas/_libs/tslibs/timestamps.pyx [34/40] Cythonizing pandas/_libs/tslibs/timezones.pyx [35/40] Cythonizing pandas/_libs/tslibs/tzconversion.pyx [36/40] Cythonizing pandas/_libs/tslibs/vectorized.pyx [37/40] Cythonizing pandas/_libs/window/aggregations.pyx [38/40] Cythonizing pandas/_libs/window/indexers.pyx [39/40] Cythonizing pandas/_libs/writers.pyx [40/40] Cythonizing 
pandas/io/sas/sas.pyx /tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/setuptools/init.py:85: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated. Requirements should be satisfied by a PEP 517 installer. If you are using pip, you can try pip install --use-pep517. dist.fetch_build_eggs(dist.setup_requires) running install [...] x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-39/pandas/_libs/testing.o -L/usr/lib -o build/lib.linux-x86_64-cpython-39/pandas/_libs/testing.cpython-39-x86_64-linux-gnu.so building 'pandas._libs.window.aggregations' extension creating build/temp.linux-x86_64-cpython-39/pandas/_libs/window x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -Ipandas/_libs/window -I./pandas/_libs -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/window/aggregations.cpp -o build/temp.linux-x86_64-cpython-39/pandas/_libs/window/aggregations.o -Werror x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-39/pandas/_libs/window/aggregations.o -L/usr/lib -o build/lib.linux-x86_64-cpython-39/pandas/_libs/window/aggregations.cpython-39-x86_64-linux-gnu.so building 'pandas._libs.window.indexers' extension x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. 
-fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/window/indexers.c -o build/temp.linux-x86_64-cpython-39/pandas/_libs/window/indexers.o -Werror x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-39/pandas/_libs/window/indexers.o -L/usr/lib -o build/lib.linux-x86_64-cpython-39/pandas/_libs/window/indexers.cpython-39-x86_64-linux-gnu.so building 'pandas._libs.writers' extension x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/writers.c -o build/temp.linux-x86_64-cpython-39/pandas/_libs/writers.o -Werror pandas/_libs/writers.c: In function ‘__pyx_f_6pandas_5_libs_7writers_word_len’: pandas/_libs/writers.c:5092:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations] 5092 | __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val); | ^~~~~~~~~ In file included from /usr/include/python3.9/unicodeobject.h:1026, from /usr/include/python3.9/Python.h:97, from pandas/_libs/writers.c:38: /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here 446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject op) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~ pandas/_libs/writers.c:5092:5: error: ‘PyUnicode_AsUnicode’ is deprecated [-Werror=deprecated-declarations] 5092 | __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val); | ^~~~~~~~~ In file included from /usr/include/python3.9/unicodeobject.h:1026, from 
/usr/include/python3.9/Python.h:97, from pandas/_libs/writers.c:38: /usr/include/python3.9/cpython/unicodeobject.h:580:45: note: declared here 580 | Py_DEPRECATED(3.3) PyAPI_FUNC(Py_UNICODE ) PyUnicode_AsUnicode( | ^~~~~~~~~~~~~~~~~~~ pandas/_libs/writers.c:5092:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations] 5092 | __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val); | ^~~~~~~~~ In file included from /usr/include/python3.9/unicodeobject.h:1026, from /usr/include/python3.9/Python.h:97, from pandas/_libs/writers.c:38: /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here 446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject *op) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~ cc1: all warnings being treated as errors error: command '/usr/bin/x86_64-linux-gnu-gcc' failed with exit code 1

----------------------------------------

Command "/home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;file='/tmp/pip-install-t6avqa7g/pandas/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-record-aq3utvkw/install-record.txt --single-version-externally-managed --compile --install-headers /home/admin/chainhammer/env/include/site/python3.9/pandas" failed with error code 1 in /tmp/pip-install-t6avqa7g/pandas/ You are using pip version 18.1, however version 23.0.1 is available. You should consider upgrading via the 'pip install --upgrade pip' command. ++ echo

++ ipython kernel install --user --name=Python.3.py3eth
scripts/install-virtualenv.sh: line 36: ipython: command not found
++ echo

++ set +x

```
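The log points at its own likely remedy: the virtualenv's pip 18.1 is too old to fetch modern prebuilt pandas wheels, so it compiles Cython-generated C against Python 3.9 headers, where the removed `PyUnicode_GET_SIZE` machinery fails under `-Werror`. A plausible workaround sketch (not part of the original install script; assumes the virtualenv lives in `env/` as described in the folders section):

```shell
# Workaround sketch (untested assumption, not the project's official fix):
# upgrade pip inside the project's virtualenv so it can resolve pandas to
# a prebuilt binary wheel instead of compiling it from source.
source env/bin/activate
pip install --upgrade pip setuptools wheel
pip install pandas           # should now install from a wheel, no gcc needed
pip install ipython          # also reported missing at the end of the log
```

If the same pandas version pin is kept, a wheel may still not exist for Python 3.9; in that case bumping the pinned pandas version is the other half of the fix.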

Bump werkzeug from 0.14.1 to 2.2.3

opened on 2023-02-15 20:08:05 by dependabot[bot]

Bumps werkzeug from 0.14.1 to 2.2.3.

Release notes

Sourced from werkzeug's releases.

2.2.3

This is a fix release for the 2.2.x release branch.

This release contains security fixes for:

2.2.2

This is a fix release for the 2.2.0 feature release.

2.2.1

This is a fix release for the 2.2.0 feature release.

2.2.0

This is a feature release, which includes new features and removes previously deprecated features. The 2.2.x branch is now the supported bugfix branch, the 2.1.x branch will become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades.

2.1.2

This is a fix release for the 2.1.0 feature release.

2.1.1

This is a fix release for the 2.1.0 feature release.

2.1.0

This is a feature release, which includes new features and removes previously deprecated features. The 2.1.x branch is now the supported bugfix branch, the 2.0.x branch will become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades.

2.0.3

... (truncated)

Changelog

Sourced from werkzeug's changelog.

Version 2.2.3

Released 2023-02-14

  • Ensure that URL rules using path converters will redirect with strict slashes when the trailing slash is missing. (#2533)
  • Type signature for get_json specifies that return type is not optional when silent=False. (#2508)
  • parse_content_range_header returns None for a value like bytes */-1 where the length is invalid, instead of raising an AssertionError. (#2531)
  • Address remaining ResourceWarning related to the socket used by run_simple. Remove prepare_socket, which now happens when creating the server. (#2421)
  • Update pre-existing headers for multipart/form-data requests with the test client. (#2549)
  • Fix handling of header extended parameters such that they are no longer quoted. (#2529)
  • LimitedStream.read works correctly when wrapping a stream that may not return the requested size in one read call. (#2558)
  • A cookie header that starts with = is treated as an empty key and discarded, rather than stripping the leading =.
  • Specify a maximum number of multipart parts, default 1000, after which a RequestEntityTooLarge exception is raised on parsing. This mitigates a DoS attack where a larger number of form/file parts would result in disproportionate resource use.
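The last bullet is the security-relevant change in 2.2.3. A minimal illustrative sketch of the mitigation idea in plain Python — the function and exception names below are hypothetical stand-ins, not Werkzeug's actual parser:

```python
# Illustrative sketch (hypothetical names, not Werkzeug's real code) of the
# 2.2.3 mitigation: refuse to parse more than a configurable number of
# multipart parts, so a request carrying a huge number of tiny form/file
# parts cannot consume disproportionate server resources.

class RequestEntityTooLarge(Exception):
    """Raised when a request exceeds a configured parsing limit."""

def parse_parts(parts, max_form_parts=1000):
    parsed = []
    for count, part in enumerate(parts, start=1):
        if count > max_form_parts:
            raise RequestEntityTooLarge(
                f"request exceeded {max_form_parts} multipart parts")
        parsed.append(part)
    return parsed
```

The key design point is that the limit is enforced *during* parsing, before all parts are materialized, rather than by counting afterwards.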

Version 2.2.2

Released 2022-08-08

  • Fix router to restore the 2.1 strict_slashes == False behaviour whereby leaf-requests match branch rules and vice versa. (#2489)
  • Fix router to identify invalid rules rather than hang parsing them, and to correctly parse / within converter arguments. (#2489)
  • Update subpackage imports in werkzeug.routing to use the import as syntax for explicitly re-exporting public attributes. (#2493)
  • Parsing of some invalid header characters is more robust. (#2494)
  • When starting the development server, a warning not to use it in a production deployment is always shown. (#2480)
  • LocalProxy.__wrapped__ is always set to the wrapped object when the proxy is unbound, fixing an issue in doctest that would cause it to fail. (#2485)
  • Address one ResourceWarning related to the socket used by run_simple. (#2421)

... (truncated)

Commits
  • 22a254f release version 2.2.3
  • 517cac5 Merge pull request from GHSA-xg9f-g7g7-2323
  • babc8d9 rewrite docs about request data limits
  • 09449ee clean up docs
  • fe899d0 limit the maximum number of multipart form parts
  • cf275f4 Merge pull request from GHSA-px8h-6qxv-m22q
  • 8c2b4b8 don't strip leading = when parsing cookie
  • 7c7ce5c [pre-commit.ci] pre-commit autoupdate (#2585)
  • 19ae03e [pre-commit.ci] auto fixes from pre-commit.com hooks
  • a83d3b8 [pre-commit.ci] pre-commit autoupdate
  • Additional commits viewable in compare view


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
  • `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
  • `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
  • `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language

You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/drandreaskrueger/chainhammer/network/alerts).

Bump ipython from 7.2.0 to 8.10.0

opened on 2023-02-10 23:29:00 by dependabot[bot]

Bumps ipython from 7.2.0 to 8.10.0.

Release notes

Sourced from ipython's releases.

See https://pypi.org/project/ipython/

We do not use GitHub release anymore. Please see PyPI https://pypi.org/project/ipython/

7.9.0

No release notes provided.

7.8.0

No release notes provided.

7.7.0

No release notes provided.

7.6.1

No release notes provided.

7.6.0

No release notes provided.

7.5.0

No release notes provided.

7.4.0

No release notes provided.

7.3.0

No release notes provided.

Commits
  • 15ea1ed release 8.10.0
  • 560ad10 DOC: Update what's new for 8.10 (#13939)
  • 7557ade DOC: Update what's new for 8.10
  • 385d693 Merge pull request from GHSA-29gw-9793-fvw7
  • e548ee2 Swallow potential exceptions from showtraceback() (#13934)
  • 0694b08 MAINT: mock slowest test. (#13885)
  • 8655912 MAINT: mock slowest test.
  • a011765 Isolate the attack tests with setUp and tearDown methods
  • c7a9470 Add some regression tests for this change
  • fd34cf5 Swallow potential exceptions from showtraceback()
  • Additional commits viewable in compare view



Bump certifi from 2018.11.29 to 2022.12.7

opened on 2022-12-08 01:27:01 by dependabot[bot]

Bumps certifi from 2018.11.29 to 2022.12.7.

Commits



Bump eth-account from 0.3.0 to 0.5.9

opened on 2022-08-30 20:21:35 by dependabot[bot]

Bumps eth-account from 0.3.0 to 0.5.9.

Changelog

Sourced from eth-account's changelog.

eth-account v0.5.9 (2022-08-04)

Bugfixes


  • fix DoS-able regex pattern ([#178](https://github.com/ethereum/eth-account/issues/178))

Miscellaneous changes

  • [#183](https://github.com/ethereum/eth-account/issues/183), [#184](https://github.com/ethereum/eth-account/issues/184)
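The "DoS-able regex" fixed in 0.5.9 is a ReDoS (regular-expression denial of service) issue. A generic illustration of the pattern class involved — this is NOT eth-account's actual regex, just the textbook shape:

```python
import re

# Generic ReDoS illustration (not eth-account's actual pattern): nested
# quantifiers such as (a+)+ backtrack exponentially on inputs that almost
# match, e.g. "a" * 30 + "b", which can stall a server. The usual fix is
# an equivalent pattern without nested quantifiers, which rejects the
# same bad input in linear time.
vulnerable = re.compile(r"^(a+)+$")   # never run this on hostile input
safe = re.compile(r"^a+$")            # matches exactly the same language

assert safe.match("a" * 30)
assert safe.match("a" * 30 + "b") is None
```

Both patterns accept the same strings; only their worst-case matching cost differs, which is why such fixes are behaviour-preserving one-liners.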

eth-account v0.5.8 (2022-06-06)

Miscellaneous changes


  • [#163](https://github.com/ethereum/eth-account/issues/163), [#168](https://github.com/ethereum/eth-account/issues/168)

eth-account v0.5.7 (2022-01-27)

Features


  • Add support for Python 3.9 and 3.10 ([#139](https://github.com/ethereum/eth-account/issues/139))

Bugfixes

  • recover_message now raises an eth_keys.exceptions.BadSignature error if the v, r, and s points are invalid ([#142](https://github.com/ethereum/eth-account/issues/142))

eth-account v0.5.6 (2021-09-22)

Features


  • An explicit transaction type is no longer required for signing a transaction if we can implicitly determine the transaction type from the transaction parameters ([#125](https://github.com/ethereum/eth-account/issues/125))

Bugfixes

... (truncated)

Commits
  • c0060ca Bump version: 0.5.8 → 0.5.9
  • da84e6d Compile release notes
  • 891ec7c Allow bumpversion to find version (#184)
  • f1ee532 Update mypy version (#183)
  • 70f89be fix redos-able regex and add poc code to tests (#182)
  • d8f88e7 Bump version: 0.5.7 → 0.5.8
  • 424bebc Compile release notes
  • 0b2e5a5 Allow wider range of towncrier again, exclude venv/ files from distribution (...
  • 53400cb Open up bitarray dependency to allow <3 (#163)
  • 40cca90 Bump version: 0.5.6 → 0.5.7
  • Additional commits viewable in compare view



Bump nbconvert from 5.4.0 to 6.5.1

opened on 2022-08-23 17:26:05 by dependabot[bot]

Bumps nbconvert from 5.4.0 to 6.5.1.

Release notes

Sourced from nbconvert's releases.

Release 6.5.1

No release notes provided.

6.5.0

What's Changed

New Contributors

Full Changelog: https://github.com/jupyter/nbconvert/compare/6.4.5...6.5

6.4.3

What's Changed

New Contributors

Full Changelog: https://github.com/jupyter/nbconvert/compare/6.4.2...6.4.3

6.4.0

What's Changed

New Contributors

... (truncated)

Commits


drandreaskrueger

also see: https://gitlab.com/andreaskrueger
