Compare commits

...

86 commits

Author SHA1 Message Date
Lőrinc
3170e2c162 prevector: store P2WSH/P2TR/P2PK scripts inline
The current `prevector` size of 28 bytes (chosen to fill the `sizeof(CScript)` aligned size) was introduced in 2015 (https://github.com/bitcoin/bitcoin/pull/6914) before SegWit and TapRoot.
However, the increasingly common `P2WSH` and `P2TR` scripts are both 34 bytes, and are forced to use heap (re)allocation rather than efficient inline storage.

The core trade-off of this change is to eliminate heap allocations for common 34-36 byte scripts at the cost of increasing the base memory footprint of all `CScript` objects by 8 bytes (while still respecting peak memory usage defined by `-dbcache`).

Increasing the `prevector` size allows these scripts to be stored inline, avoiding extra heap allocations, reducing potential memory fragmentation, and improving performance during cache flushes. Massif analysis confirms a lower stable memory usage after flushing, suggesting the elimination of heap allocations outweighs the larger base size for common workloads.

Due to memory alignment, increasing the `prevector` size to 36 bytes doesn't change the overall `sizeof(CScript)` compared to an increase to 34 bytes, allowing us to include `P2PK` scripts as well at no additional memory cost.
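
For illustration, here is a minimal sketch (assuming a simplified container, not the actual `src/prevector.h` implementation) of how a prevector-style class with inline capacity `N` chooses between direct and indirect storage:

```cpp
// Minimal sketch, assuming a simplified prevector-like container
// (this is NOT the real src/prevector.h code). Storage is inline ("direct")
// while the byte count fits the template capacity N, and heap-allocated
// ("indirect") otherwise.
#include <cstddef>
#include <cstdlib>
#include <cstring>

template <unsigned int N>
class TinyPrevector
{
    union {
        unsigned char direct[N];                                       // inline buffer
        struct { unsigned char* ptr; std::size_t capacity; } indirect; // heap buffer
    } m_union;
    std::size_t m_size{0};

public:
    bool is_direct() const { return m_size <= N; }

    void assign(const unsigned char* data, std::size_t len)
    {
        if (len <= N) {
            std::memcpy(m_union.direct, data, len);                    // no allocation
        } else {
            m_union.indirect.ptr = static_cast<unsigned char*>(std::malloc(len));
            m_union.indirect.capacity = len;
            std::memcpy(m_union.indirect.ptr, data, len);              // heap allocation
        }
        m_size = len;
    }
    // destructor/copy/move omitted for brevity
};

// With TinyPrevector<28>, assigning a 34-byte P2TR script goes indirect;
// with TinyPrevector<36>, the same assignment stays direct (inline).
```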

Performance benchmarks for AssumeUTXO load and flush show:
* Small dbcache (450MB): ~1-3% performance improvement (despite more frequent flushes)
* Large dbcache (4500MB): ~6-8% performance improvement due to fewer heap allocations (and basically the same number of flushes)
* Very large dbcache (4500MB): ~5-6% performance improvement due to fewer heap allocations (and the memory limit not being reached, so there's no memory penalty)

Full IBD and reindex-chainstate with larger `dbcache` values also show an overall ~2-3% speedup.

Co-authored-by: Ava Chow <github@achow101.com>
Co-authored-by: Andrew Toth <andrewstoth@gmail.com>
2025-04-29 11:40:56 +02:00
Lőrinc
ecc6c07e58 test: assert CScript allocation characteristics
Verifies that script types are correctly allocated using prevector's direct or indirect storage based on their size:

Directly allocated script types (size ≤ 28 bytes):
* OP_RETURN (small)
* P2WPKH
* P2SH
* P2PKH

Indirectly allocated script types (size > 28 bytes):
* P2WSH
* P2TR
* P2PK
* MULTISIG (small)

This test provides a baseline for verifying changes to prevector's inline capacity.
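
As a rough illustration of the boundary being pinned down (a sketch using well-known scriptPubKey sizes, not the real CScript objects from the unit test):

```cpp
// Hedged sketch: the script sizes below are standard scriptPubKey lengths;
// PREVECTOR_INLINE_CAPACITY mirrors the current 28-byte inline capacity.
#include <cstddef>

constexpr std::size_t PREVECTOR_INLINE_CAPACITY{28};

constexpr bool IsDirect(std::size_t script_size)
{
    return script_size <= PREVECTOR_INLINE_CAPACITY;
}

static_assert(IsDirect(22));   // P2WPKH: OP_0 <20-byte key hash>
static_assert(IsDirect(23));   // P2SH:   OP_HASH160 <20-byte script hash> OP_EQUAL
static_assert(IsDirect(25));   // P2PKH:  OP_DUP OP_HASH160 <20-byte hash> OP_EQUALVERIFY OP_CHECKSIG
static_assert(!IsDirect(34));  // P2WSH:  OP_0 <32-byte script hash>
static_assert(!IsDirect(34));  // P2TR:   OP_1 <32-byte x-only key>
static_assert(!IsDirect(35));  // P2PK:   <33-byte compressed key> OP_CHECKSIG

int main() {}
```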
2025-04-29 11:40:54 +02:00
Lőrinc
a5f2ffa951 refactor: extract PREVECTOR_SIZE to header constant
Co-authored-by: Luke Dashjr <luke-jr+git@utopios.org>
Co-authored-by: kuegi <29012906+kuegi@users.noreply.github.com>
2025-04-29 11:32:47 +02:00
merge-script
c5e44a0435
Merge bitcoin/bitcoin#32369: test: Use the correct node for doubled keypath test
32d55e28af test: Use the correct node for doubled keypath test (Ava Chow)

Pull request description:

  #29124 had a silent merge conflict with #32350 which resulted in it using the wrong node. Fix the test to use the correct v22 node.

ACKs for top commit:
  maflcko:
    lgtm ACK 32d55e28af
  rkrux:
    ACK 32d55e28af
  BrandonOdiwuor:
    Code Review ACK 32d55e28af

Tree-SHA512: 1e0231985beb382b16e1d608c874750423d0502388db0c8ad450b22d17f9d96f5e16a6b44948ebda5efc750f62b60d0de8dd20131f449427426a36caf374af92
2025-04-29 09:59:42 +01:00
Ava Chow
32d55e28af test: Use the correct node for doubled keypath test 2025-04-28 14:44:17 -07:00
Ava Chow
65714c162c
Merge bitcoin/bitcoin#32327: test: Add missing check for empty stderr in util tester
fadf12a56c test: Add missing check for empty stderr in util tester (MarcoFalke)

Pull request description:

  Now that wine support was removed from the CI in 25b56fd9b4, it can probably be removed from the util tester as well.

  If someone really needs this, they can comment the new check out, or submit a patch to add an option/env var to silence the new check.

ACKs for top commit:
  achow101:
    ACK fadf12a56c
  i-am-yuvi:
    tACK fadf12a56c
  BrandonOdiwuor:
    Code Review ACK fadf12a56c
  ismaelsadeeq:
    Tested ACK fadf12a56c

Tree-SHA512: d9e4d7a7f724e114391070ea7f17b585a7e4c4f3221c3bf510eeb11df6ccd089b881ab5654adfef8d3a1f8fa7ec6bf5e3a3feeb0cdfe724a8f3e5a146c388e66
2025-04-28 14:43:32 -07:00
merge-script
a4eee6d50b
Merge bitcoin/bitcoin#29124: test: Test that migration automatically repairs corrupted metadata with doubled derivation path
c7e2b9e264 tests: Test migration cleans up bad inactive chain derivation path (Ava Chow)

Pull request description:

  A bug in 0.21.x and 22.x resulted in some wallets having invalid derivation paths that are the concatenation of two derivation paths. These appear only when inactive hd chains are topped up.

  Since key metadata is a legacy wallet only record, migrating legacy wallets to descriptor wallets will fix this issue as all key metadata records are deleted. The derivation path information is derived on-the-fly from the descriptor that is produced for the inactive hd chain.

  Thus we only need a test to verify that the derivation paths are good, and that all key metadata records are deleted from the migrated wallet.

ACKs for top commit:
  murchandamus:
    re-ACK c7e2b9e264 via range-diff:
  rkrux:
    re-ACK c7e2b9e264
  furszy:
    utACK c7e2b9e264

Tree-SHA512: 3117c4a43798972109fe2d3539341a8b69db70c6457fcabdd019e6044834dc4b17212abbc006d7b8008f560dce4b7856142b057981b9404f406d58fa0955cbd9
2025-04-28 17:13:42 -04:00
Ava Chow
af6cffa36d
Merge bitcoin/bitcoin#32350: test: Slim down previous releases bdb check
fa58f40b89 test: Slim down previous releases bdb check (MarcoFalke)

Pull request description:

  The check iterates over several previous BDB-only releases to check that descriptor wallets are considered "corrupt" when loading. It is unclear why this needs to be done for more than one release.

  Avoid the confusion by removing the unused releases from the test and from the download script.

ACKs for top commit:
  achow101:
    ACK fa58f40b89
  rkrux:
    ACK fa58f40b89

Tree-SHA512: 8084392481bfe1fba9b80bb865ffbdfa454e9e6e14e02c39fa3f61c1a596b1def2c531c5da1c7566e5fddb77ac7e56f19feabaaf9b5af043fa6c230d9e2370b5
2025-04-28 12:56:22 -07:00
Ava Chow
33e6538b30
Merge bitcoin/bitcoin#32360: test: Force named args for RPCOverloadWrapper optional args
fa48be3ba4 test: Force named args for RPCOverloadWrapper optional args (MarcoFalke)
aaaa45399c test: Remove unused createwallet_passthrough (MarcoFalke)
cccc1f4e91 test: Remove unused RPCOverloadWrapper is_cli field (MarcoFalke)

Pull request description:

  This can avoid bugs and makes the test code easier to read, because the
  order of positional args does not have to be known or assumed.

  Also, contains two commits to remove dead code.

ACKs for top commit:
  achow101:
    ACK fa48be3ba4
  rkrux:
    tACK fa48be3ba4
  janb84:
    tACK [fa48be3](fa48be3ba4)

Tree-SHA512: d938fbc18be5035ad0d0e1ad2bf7297b2b66ede3bb2d3f10b8d27aa2a19d27a897b024a5f5a2a1cceca467837890729c26054928cb06acbe282b9e9eea94ae69
2025-04-28 12:41:00 -07:00
merge-script
3a29ba33dc
Merge bitcoin/bitcoin#32357: depends: Fix cross-compiling qt package from macOS to Windows
35e57fbe33 depends: Fix cross-compiling `qt` package from macOS to Windows (Hennadii Stepanov)

Pull request description:

  Native packages cannot be used during cross-compiling. However, Qt still unconditionally tries to find them, which causes issues in some cases, such as when [cross-compiling from macOS to Windows](https://github.com/bitcoin/bitcoin/issues/32346).

  This PR explicitly disables this unnecessary Qt behaviour.

  Fixes https://github.com/bitcoin/bitcoin/issues/32346.

  Here is a full workflow on my macOS Sequoia 15.4.1 (Intel):
  ```
  % brew install make cmake ninja mingw-w64 nsis
  % gmake -C depends -j 10 HOST=x86_64-w64-mingw32
  % cmake -B build --toolchain depends/x86_64-w64-mingw32/toolchain.cmake
  % cmake --build build -j 10 -t deploy
  ```

ACKs for top commit:
  shahsb:
    ACK 35e57fbe33
  fanquake:
    ACK 35e57fbe33

Tree-SHA512: 2822fb49bc84dd094dbd189d8a9ca0f023e1e48127db7beaefb9db92de53df63bb0f399c9c430c33941f9a9ee6976b9161d80467d889f7717385b9d1ea9fee43
2025-04-28 16:15:13 +01:00
MarcoFalke
fa48be3ba4
test: Force named args for RPCOverloadWrapper optional args
This can avoid bugs and makes the test code easier to read, because the
order of positional args does not have to be known or assumed.
2025-04-28 15:15:05 +02:00
MarcoFalke
aaaa45399c
test: Remove unused createwallet_passthrough 2025-04-28 15:14:52 +02:00
merge-script
f409444d02
Merge bitcoin/bitcoin#32071: build: Drop option to disable hardening.
77e553ab6a build: refactor: hardening flags -> core_interface (David Gumberg)
00ba3ba303 build: Drop option for disabling hardening (David Gumberg)
f57db75e91 build: Use `-z noseparate-code` on NetBSD < 11.0 (David Gumberg)

Pull request description:

  Follow up to #32038 which dropped `NO_HARDEN` from depends builds, this PR drops the `ENABLE_HARDENING` build option since disabling hardening of binaries should not be a supported or maintained use case. With this change, hardening flags are always enabled.

  Individual hardening flags and options can still be disabled by appending flags, e.g.:

  ```bash
  cmake -B build \
    -DAPPEND_CPPFLAGS='-U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=0 -fno-stack-protector -fcf-protection=none -fno-stack-clash-protection' \
    -DAPPEND_LDFLAGS='-Wl,-z,lazy -Wl,-z,norelro -Wl,-z,noseparate-code'
  ```

  There is an issue with NetBSD 10.0's dynamic linker that makes one of the hardening linker flags, `-z separate-code`, [problematic](https://github.com/bitcoin/bitcoin/pull/28724#issuecomment-2589347934), so this PR also introduces a check to prevent the use of this flag in NetBSD versions < 11.0, (where this issue is [fixed](acf7fb3abf)). The fix for this [might be backported](https://mail-index.netbsd.org/tech-userlevel/2023/01/05/msg013670.html) to NetBSD 10.0.

  I suggest reviewing the diff with whitespace changes hidden (`git diff -w` or using github's hide whitespace option)

ACKs for top commit:
  hebasto:
    re-ACK 77e553ab6a.
  laanwj:
    re-ACK 77e553ab6a
  janb84:
    ACK [77e553a](77e553ab6a)
  vasild:
    ACK 77e553ab6a
  musaHaruna:
    tested ACK [77e553](77e553ab6a)

Tree-SHA512: b149fb0371d12312c140255bf674c2bdc9f5272a5750a5b9ec5f192323364bb2ea8e164af13b9ab981ab3aa7ceb91b7a64785081e7458470e81c2f5228abf7b1
2025-04-28 13:32:16 +01:00
MarcoFalke
cccc1f4e91
test: Remove unused RPCOverloadWrapper is_cli field 2025-04-28 10:18:59 +02:00
Hennadii Stepanov
d62c2d82e1
Merge bitcoin/bitcoin#32353: doc: Fix fuzz test_runner.py path
61f238e84a doc: Fix fuzz test_runner.py path (monlovesmango)

Pull request description:

  This commit fixes the path listed in the documentation for the fuzz testing test_runner.py. Previously the --help option worked but running fuzz tests from the documented path did not.

ACKs for top commit:
  kevkevinpal:
    ACK [61f238e](61f238e84a)
  maflcko:
    lgtm ACK 61f238e84a
  mabu44:
    Tested ACK 61f238e84a
  hebasto:
    ACK 61f238e84a.

Tree-SHA512: e8770f38e49a428e0e7f0450db193ec90cc1e66c05bcde307763c065ac7051f3f05923bb3e0eca7a337da9c14cfd17512ff0d01ffa330796159d4f3552103b7f
2025-04-27 16:16:21 +01:00
Hennadii Stepanov
35e57fbe33 depends: Fix cross-compiling qt package from macOS to Windows 2025-04-26 17:13:16 +01:00
Hennadii Stepanov
d2ac748e9e
Merge bitcoin-core/gui#864: Crash fix, disconnect numBlocksChanged() signal during shutdown
71656bdfaa gui: crash fix, disconnect numBlocksChanged() signal during shutdown (furszy)

Pull request description:

  Aiming to fix bitcoin-core/gui#862.

  The crash stems from the order of the shutdown procedure:
  We first unset the client model, then destroy the wallet controller—but we leave
  the internal wallet models (`m_wallets`) untouched for a brief period. As a result,
  there’s a point in time where views still have connected signals and access to
  wallet models that are not connected to any wallet controller.
  Now.. since the `clientModel` is only replaced with nullptr locally and not destroyed
  yet, signals like `numBlocksChanged` can still emit. Thus, when wallet views receive
  them, they see a non-null wallet model ptr, and proceed to call backend functions
  from a model that is being torn down.

  Since the shutdown procedure begins by unsetting `clientModel` from all views, it’s safe
  to ignore events when `clientModel` is nullptr.
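
  A rough sketch of this guard (illustrative types only, not the real Qt/WalletView code):
  ```cpp
  // Hedged sketch: once clientModel has been unset during shutdown, block
  // notifications are ignored instead of touching a wallet model that is
  // being torn down.
  struct ClientModel;
  struct WalletModel { void updateStatus() {} };

  struct WalletViewLike {
      ClientModel* clientModel{nullptr};
      WalletModel* walletModel{nullptr};

      void onNumBlocksChanged()
      {
          if (!clientModel) return;                 // shutdown in progress: ignore
          if (walletModel) walletModel->updateStatus();
      }
  };
  ```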

ACKs for top commit:
  maflcko:
    lgtm ACK 71656bdfaa
  pablomartin4btc:
    re-ACK 71656bdfaa
  hebasto:
    ACK 71656bdfaa, I have reviewed the code and it looks OK.

Tree-SHA512: e6a369c40aad8a5a3da64e92daa10250006f60c53feef353a5580e1bdb17fe8e1ad102abf5419ddeff1caa703b69ab634265ef3b9cfef87e9304f97bfdd2c4aa
2025-04-26 13:45:31 +01:00
Hennadii Stepanov
de90b47ea0
Merge bitcoin-core/gui#868: Replace stray tfm::format to cerr with qWarning
edd46566bd qt: Replace stray tfm::format to cerr with qWarning (laanwj)

Pull request description:

  GUI warnings should go to the log, not to the console (which may not be connected at all).

ACKs for top commit:
  hebasto:
    ACK edd46566bd, I have reviewed the code and it looks OK.

Tree-SHA512: 32944e00dae0c62bb23e3d7abd486b63e445702483ca03c74c3057ef942f06e771d4d3d3a58fd728582889d6b638fae11ecc536a25febfd89a28522b7d6d08ba
2025-04-26 07:56:23 +01:00
monlovesmango
61f238e84a doc: Fix fuzz test_runner.py path
This commit fixes the path listed in the documentation for the fuzz
testing test_runner.py. Previously the --help option worked but running
fuzz tests from the documented path did not.
2025-04-25 16:33:58 +00:00
Ava Chow
c7e2b9e264 tests: Test migration cleans up bad inactive chain derivation path
A bug in 0.21.x and 22.x resulted in some wallets having invalid
derivation paths that are the concatenation of two derivation paths.
These appear only when inactive hd chains are topped up.

Since key metadata is a legacy wallet only record, migrating legacy
wallets to descriptor wallets will fix this issue as all key metadata
records are deleted. The derivation path information is derived
on-the-fly from the descriptor that is produced for the inactive hd
chain.

Thus we only need a test to verify that the derivation paths are good,
and that all key metadata records are deleted from the migrated wallet.
2025-04-25 09:12:53 -07:00
MarcoFalke
fa58f40b89
test: Slim down previous releases bdb check 2025-04-25 16:04:07 +02:00
merge-script
80e6ad9e30
Merge bitcoin/bitcoin#31250: wallet: Disable creating and loading legacy wallets
17bb63f9f9 wallet: Disallow loading legacy wallets (Ava Chow)
9f04e02ffa wallet: Disallow creating legacy wallets (Ava Chow)
6b247279b7 wallet: Disallow legacy wallet creation from the wallet tool (Ava Chow)
5e93b1fd6c bench: Remove WalletLoadingLegacy benchmark (Ava Chow)
56f959d829 wallet: Remove wallettool salvage (Ava Chow)
7a41c939f0 wallet: Remove -format and bdb from wallet tool's createfromdump (Ava Chow)
c847dee148 test: remove legacy wallet functional tests (Ava Chow)
20a9173717 test: Remove legacy wallet tests from wallet_reindex.py (Ava Chow)
446d480cb2 test: Remove legacy wallet tests from wallet_backwards_compatibility.py (Ava Chow)
aff80298d0 test: wallet_signer.py bdb will be removed (Ava Chow)
f94f9399ac test: Remove legacy wallet unit tests (Ava Chow)
d9ac9dbd8e tests, gui: Use descriptors watchonly wallet for watchonly test (Ava Chow)

Pull request description:

  To prepare for the deletion of legacy wallet code, disable creating or loading new legacy wallets.

  Tests for the legacy wallet specifically are deleted.

  Split from https://github.com/bitcoin/bitcoin/pull/28710

ACKs for top commit:
  Sjors:
    re-ACK 17bb63f9f9
  pablomartin4btc:
    re-ACK 17bb63f9f9
  laanwj:
    re-ACK 17bb63f9f9

Tree-SHA512: d7a86df1f71f12451b335f22f7c3f0394166ac3f8f5b81f6bbf0321026e2e8ed621576656c371d70e202df1be4410b2b1c1acb5d5f0c341e7b67aaa0ac792e7c
2025-04-25 13:11:24 +01:00
merge-script
971952588d
Merge bitcoin/bitcoin#32242: guix: Remove unused file package
513e2020a9 guix: Remove unused `file` package (Hennadii Stepanov)

Pull request description:

  The `file` utility has not been required since Guix builds were introduced in https://github.com/bitcoin/bitcoin/pull/15277.

  My Guix build:
  ```
  feeb8da87994724878ac8a62a6e9bfb9b1ece855f336327c900214d84d0a2bb9  guix-build-513e2020a9ac/output/aarch64-linux-gnu/SHA256SUMS.part
  216b287ffbff054a14e60e14b3376acb8cd41dd224223ecd508bd46efad020ba  guix-build-513e2020a9ac/output/aarch64-linux-gnu/bitcoin-513e2020a9ac-aarch64-linux-gnu-debug.tar.gz
  3e68be5a329cbf5f389ca0dde51c1b1dd16b63ec26f42b8aa97dbd455ee3f30e  guix-build-513e2020a9ac/output/aarch64-linux-gnu/bitcoin-513e2020a9ac-aarch64-linux-gnu.tar.gz
  da2d239a78e7ceaa944bd20d15118577daf5cb3817cde4bfe39ea014707326bd  guix-build-513e2020a9ac/output/arm-linux-gnueabihf/SHA256SUMS.part
  0c05b2ab0cd048d4b3b66b97f433895ca0258a68485d641a1c7aaec198114a59  guix-build-513e2020a9ac/output/arm-linux-gnueabihf/bitcoin-513e2020a9ac-arm-linux-gnueabihf-debug.tar.gz
  8cf7ff862c9eccfa7ebfb573674485082b5b797f0a8e9ba058156d3131b144fd  guix-build-513e2020a9ac/output/arm-linux-gnueabihf/bitcoin-513e2020a9ac-arm-linux-gnueabihf.tar.gz
  9464457cb7832b1805a25bb86cdc7c8670dd3c02333e229470d5348c587a73b8  guix-build-513e2020a9ac/output/arm64-apple-darwin/SHA256SUMS.part
  21f0b343cfd5a85f5b563e92931b254874a6dfad21eec91d40b485320ea37a27  guix-build-513e2020a9ac/output/arm64-apple-darwin/bitcoin-513e2020a9ac-arm64-apple-darwin-codesigning.tar.gz
  e34a91173eae3b5ebb6dfa5ca39aa5bb782a2bfd30331edfa0075edda766ac7b  guix-build-513e2020a9ac/output/arm64-apple-darwin/bitcoin-513e2020a9ac-arm64-apple-darwin-unsigned.tar.gz
  6c42015d14fb82f124450498299db15147ee3d93d5192f2fa45d700db9ef0fb9  guix-build-513e2020a9ac/output/arm64-apple-darwin/bitcoin-513e2020a9ac-arm64-apple-darwin-unsigned.zip
  ddd176d8fcf325c23cb1d9c17933f651b26ce3ef13d1c4962460b3bfdb9fd7ff  guix-build-513e2020a9ac/output/dist-archive/bitcoin-513e2020a9ac.tar.gz
  92af97145e0dbeac11a49da48e567f3ee6e08c88c17aa434f10dc4ae13125a57  guix-build-513e2020a9ac/output/powerpc64-linux-gnu/SHA256SUMS.part
  fe980f2d4a6a6c567f447bd86221887ea3f9dcceb14a42e20f9cb8682ff66efe  guix-build-513e2020a9ac/output/powerpc64-linux-gnu/bitcoin-513e2020a9ac-powerpc64-linux-gnu-debug.tar.gz
  41c9f8c46acafec1c80fc1727eddeb8e0373ef06ffc221b16af82b6d61b9c49a  guix-build-513e2020a9ac/output/powerpc64-linux-gnu/bitcoin-513e2020a9ac-powerpc64-linux-gnu.tar.gz
  42051e97efd6178630c65bd398db075342d2c4722f97ec774b4e506b28f7e83b  guix-build-513e2020a9ac/output/riscv64-linux-gnu/SHA256SUMS.part
  f136a82ee58d10db1360d3aa00e695c8e4017db483fe84080e8050efe4436042  guix-build-513e2020a9ac/output/riscv64-linux-gnu/bitcoin-513e2020a9ac-riscv64-linux-gnu-debug.tar.gz
  78d23a7b622e894fa150706a5981ef0bf5ec35b93caaaafdd4aa95c76689ca14  guix-build-513e2020a9ac/output/riscv64-linux-gnu/bitcoin-513e2020a9ac-riscv64-linux-gnu.tar.gz
  7faaa7151d85482a48c73416f0efc124f92fdf26ef232ec0dc9aa131661b2aa8  guix-build-513e2020a9ac/output/x86_64-apple-darwin/SHA256SUMS.part
  13b1bb0026c2441903210f13e61be6220abb75f50dd775bb0bea3bc388ed047c  guix-build-513e2020a9ac/output/x86_64-apple-darwin/bitcoin-513e2020a9ac-x86_64-apple-darwin-codesigning.tar.gz
  80296525f4c797b24e0ac9617ef1e3d4edbce638be3dee823face9554ba9f373  guix-build-513e2020a9ac/output/x86_64-apple-darwin/bitcoin-513e2020a9ac-x86_64-apple-darwin-unsigned.tar.gz
  58a2a3f7335a61c68f7023aa6e2668a958ea2a4acf241c0c9968866ca5da1a21  guix-build-513e2020a9ac/output/x86_64-apple-darwin/bitcoin-513e2020a9ac-x86_64-apple-darwin-unsigned.zip
  511e2f63b1f25e7c4309d8328cd9fa09b8b65dbcce8124334f89445f127f1fd2  guix-build-513e2020a9ac/output/x86_64-linux-gnu/SHA256SUMS.part
  6458af46a7960b668651c59a2a0fef9d44c3c17489fad06f92ca2c43a60b2b23  guix-build-513e2020a9ac/output/x86_64-linux-gnu/bitcoin-513e2020a9ac-x86_64-linux-gnu-debug.tar.gz
  f259a3364bcc3a35346498fb224e061d37d30555aa263fe74bc45e701fb4c9a6  guix-build-513e2020a9ac/output/x86_64-linux-gnu/bitcoin-513e2020a9ac-x86_64-linux-gnu.tar.gz
  246ea6cb2218518c490f9a92bcd1dcb5589ed50f11b3713e4d2721990e655e39  guix-build-513e2020a9ac/output/x86_64-w64-mingw32/SHA256SUMS.part
  beb17556e14916e2af8463090c4f0787e16721961a88ef54975b61dde36d1503  guix-build-513e2020a9ac/output/x86_64-w64-mingw32/bitcoin-513e2020a9ac-win64-codesigning.tar.gz
  85a46d84c0433a50410f1929babc1b22721e8ee7853e57ec5370a072c05146dc  guix-build-513e2020a9ac/output/x86_64-w64-mingw32/bitcoin-513e2020a9ac-win64-debug.zip
  b7191d93cb1524cfd610cee6844119739b9301550b99babb0cefe10546e53445  guix-build-513e2020a9ac/output/x86_64-w64-mingw32/bitcoin-513e2020a9ac-win64-setup-unsigned.exe
  f6615296b46fd064e02fa9b4007e5d59c255595530fc91a8303e417579a7fe80  guix-build-513e2020a9ac/output/x86_64-w64-mingw32/bitcoin-513e2020a9ac-win64-unsigned.zip
  ```

ACKs for top commit:
  janb84:
    Code review and Tested ACK [513e202](513e2020a9)
  fanquake:
    ACK 513e2020a9

Tree-SHA512: 9c684ad473ba6b298dd50c6fb2fb274aa83f5fbe54f6c3d12cd3b30cbc3f77390966ce9ac3be3f675948642743bac4fb45dcda661400bdfde4f347d300478a2a
2025-04-25 10:54:29 +01:00
merge-script
ff69046e66
Merge bitcoin/bitcoin#32215: depends: Fix cross-compiling on macOS
d0cce4172c depends: Fix `mv` command compatibility with macOS (Hennadii Stepanov)
690f5da15a depends: Specify Objective C/C++ compilers for `native_qt` package (Hennadii Stepanov)

Pull request description:

  This PR:
  1. Specifies Objective C/C++ compilers for `native_qt` package.
  2. Removes the `-t` option, which is incompatible with `mv` on macOS.

  Fixes https://github.com/bitcoin/bitcoin/issues/32208.

ACKs for top commit:
  janb84:
    ACK [d0cce41](d0cce4172c)
  fanquake:
    ACK d0cce4172c

Tree-SHA512: a224407f393cc9a90c73ce156674ec90ed74f59104849b312a993218e28f76c3f97335eed6bd5a3e552fd50002a59aa2de775d8ed7557a74c25202a638bfda8c
2025-04-25 10:15:04 +01:00
Ava Chow
4eee328a98
Merge bitcoin/bitcoin#32318: Fix failing util_time_GetTime test on Windows
3dbd50a576 Fix failing util_time_GetTime test on Windows (VolodymyrBg)

Pull request description:

  Remove unreliable steady clock time checking from the test that was causing CI failures primarily on Windows. The test previously tried to verify that steady_clock time increases after a 1ms sleep, but this approach is not reliable on platforms where such a short sleep interval may not consistently result in observable clock changes.

  This addresses issue #32197, where the test was reporting failures in the cross-built Windows CI environment. As noted in the discussion, the test is not critical to the functionality of Bitcoin Core, and removing the unreliable part is the most straightforward solution.
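
  The removed check was roughly of this form (an illustrative sketch, not the exact test code):
  ```cpp
  #include <cassert>
  #include <chrono>
  #include <thread>

  int main()
  {
      const auto t1 = std::chrono::steady_clock::now();
      std::this_thread::sleep_for(std::chrono::milliseconds{1});
      const auto t2 = std::chrono::steady_clock::now();
      // Expecting a strictly larger reading after such a short sleep proved
      // flaky on some platforms/CI environments, which is why the check was dropped.
      assert(t2 > t1);
  }
  ```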

ACKs for top commit:
  maflcko:
    lgtm ACK 3dbd50a576
  achow101:
    ACK 3dbd50a576
  laanwj:
    re-ACK 3dbd50a576

Tree-SHA512: 25c80558d9587c7845d3c14464e8d263c8bd9838a510faf44926e5cda5178aee10b03a52464246604e5d27544011d936442ecfa1e4cdaacb66d32c35f7213902
2025-04-24 15:10:04 -07:00
furszy
71656bdfaa
gui: crash fix, disconnect numBlocksChanged() signal during shutdown
The crash stems from the order of the shutdown procedure:
We first unset the client model, then destroy the wallet controller—but we leave
the internal wallet models ('m_wallets') untouched for a brief period. As a result,
there’s a point in time where views still have connected signals and access to
wallet models that are not connected to any wallet controller.
Now.. since the clientModel is only replaced with nullptr locally and not destroyed
yet, signals like numBlocksChanged can still emit. Thus, when wallet views receive
them, they see a non-null wallet model ptr, and proceed to call backend functions
from a model that is being torn down.

Since the shutdown procedure begins by unsetting clientModel from all views, it’s safe
to ignore events when clientModel is nullptr.
2025-04-24 15:33:00 -04:00
VolodymyrBg
3dbd50a576 Fix failing util_time_GetTime test on Windows
Remove unreliable steady clock time checking from the test that was causing
CI failures primarily on Windows. The test previously tried to verify that
steady_clock time increases after a 1ms sleep, but this approach is not reliable
on all platforms where such a short sleep interval may not consistently result
in observable clock changes.

This addresses issue #32197 where the test was reporting failures in the
cross-built Windows CI environment. As noted in the discussion, the test is not
critical to the functionality of Bitcoin Core, and removing the unreliable part
is the most straightforward solution.

Rename and refocus util_time_GetTime test to util_mocktime

Co-Authored-By: maflcko <6399679+maflcko@users.noreply.github.com>
2025-04-24 16:35:02 +03:00
laanwj
edd46566bd qt: Replace stray tfm::format to cerr with qWarning
GUI warnings should go to the log, not to the console (which may not be
connected at all).
2025-04-24 12:13:14 +02:00
Hennadii Stepanov
458720e5e9
Merge bitcoin/bitcoin#32336: test: Suppress upstream -Wduplicate-decl-specifier in bpfcc
facb9b327b scripted-diff: Use bpf_cflags (MarcoFalke)
fa0c1baaf8 test: Add imports for util bpf_cflags (MarcoFalke)

Pull request description:

  On some Linux kernel versions, the bpf compiler invoked in the functional tests will issue a `-Wduplicate-decl-specifier` warning.

  This seems harmless and should be fixed upstream in the Linux kernel.

  Here, simply suppress it for now. Fixes https://github.com/bitcoin/bitcoin/issues/32322

ACKs for top commit:
  laanwj:
    Code review ACK facb9b327b
  hebasto:
    ACK facb9b327b, I have reviewed the code and it looks OK.

Tree-SHA512: 53387127e3c2a2dbfe05281b2d2e61efbd3c3adcc3b4bf2f11540042f86e1e8c06637f80d246310bc44ca0612318472f25545c1e1ca3636fda97d04381f9e905
2025-04-24 10:53:11 +01:00
MarcoFalke
facb9b327b
scripted-diff: Use bpf_cflags
-BEGIN VERIFY SCRIPT-

 ren() { sed --regexp-extended -i "s/$1/$2/g" $( git grep --extended-regexp -l "$1" ) ; }

 ren 'cflags=\["-Wno-error=implicit-function-declaration"\]' 'cflags=bpf_cflags()'

-END VERIFY SCRIPT-
2025-04-24 10:47:57 +02:00
MarcoFalke
fa0c1baaf8
test: Add imports for util bpf_cflags
This is required for the next commit.
2025-04-24 10:47:44 +02:00
Ava Chow
9efe546688
Merge bitcoin/bitcoin#31835: validation: set BLOCK_FAILED_CHILD correctly
3c3548a70e validation: clarify final |= BLOCK_FAILED_VALID in InvalidateBlock (Matt Corallo)
aac5488909 validation: correctly update BlockStatus for invalid block descendants (stratospher)
9e29653b42 test: check BlockStatus when InvalidateBlock is used (stratospher)
c99667583d validation: fix traversal condition to mark BLOCK_FAILED_CHILD (stratospher)

Pull request description:

  This PR addresses 3 issues related to how `BLOCK_FAILED_CHILD` is set:
  1. In `InvalidateBlock()`
  - Previously, `BLOCK_FAILED_CHILD` was not being set when it should have been.
  - This was due to an incorrect traversal condition, which is fixed in this PR.

  2. In `SetBlockFailure()`
  - `BLOCK_FAILED_VALID` is now cleared before setting `BLOCK_FAILED_CHILD`.

  3. In `InvalidateBlock()`
  - if block is already marked as `BLOCK_FAILED_CHILD`, don't mark it as `BLOCK_FAILED_VALID` again.

  Also adds a unit test to check `BLOCK_FAILED_VALID` and `BLOCK_FAILED_CHILD` status in `InvalidateBlock()`.
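
  A hedged sketch of the flag update in point 2 (simplified stand-ins; the real `BlockStatus` enum lives in `chain.h`):
  ```cpp
  #include <cassert>
  #include <cstdint>

  enum BlockStatus : std::uint32_t {
      BLOCK_FAILED_VALID = 32,  // block failed validation itself
      BLOCK_FAILED_CHILD = 64,  // block descends from a failed block
      BLOCK_FAILED_MASK  = BLOCK_FAILED_VALID | BLOCK_FAILED_CHILD,
  };

  struct CBlockIndexLike { std::uint32_t nStatus{0}; };

  // Clear BLOCK_FAILED_VALID (if present) before setting BLOCK_FAILED_CHILD,
  // so a descendant never carries both flags at once.
  void MarkDescendantFailed(CBlockIndexLike& index)
  {
      index.nStatus &= ~BLOCK_FAILED_VALID;
      index.nStatus |= BLOCK_FAILED_CHILD;
  }

  int main()
  {
      CBlockIndexLike descendant;
      descendant.nStatus |= BLOCK_FAILED_VALID;  // previously marked invalid itself
      MarkDescendantFailed(descendant);
      assert(descendant.nStatus == BLOCK_FAILED_CHILD);
      assert((descendant.nStatus & BLOCK_FAILED_MASK) != 0);
  }
  ```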

  <details>
  <summary><h3>looking for feedback on an alternate approach</h3></summary>
  <br>

  An alternate approach could be removing `BLOCK_FAILED_CHILD` since even though we have a distinction between
  `BLOCK_FAILED_VALID` and `BLOCK_FAILED_CHILD` in the codebase, we don't use it for anything. Whenever we check for BlockStatus, we use `BLOCK_FAILED_MASK` which encompasses both of them. See  similar discussion in https://github.com/bitcoin/bitcoin/pull/16856.

  I have a branch with this approach in https://github.com/stratospher/bitcoin/commits/2025_02_remove_block_failed_child/.
  Compared to the version in #16856, it also resets `BLOCK_FAILED_CHILD` already on disk to `BLOCK_FAILED_VALID` when loading from disk so that we won't be in a dirty state in a no-`BLOCK_FAILED_CHILD`-world.

  I'm not sure if it's a good idea to remove `BLOCK_FAILED_CHILD` though. would be curious to hear what others think of this approach.

  thanks @ mzumsande for helpful discussion regarding this PR!
  </details>

ACKs for top commit:
  achow101:
    ACK 3c3548a70e
  TheCharlatan:
    Re-ACK 3c3548a70e
  mzumsande:
    re-ACK 3c3548a70e

Tree-SHA512: 83e0d29dea95b97519d4868135c965b86f6f43be50b15c0bd8f998b3476388fc7cc22b49c0c54ec532ae8222e57dfc436438f0c8e98f54757b384f220488b6a6
2025-04-23 14:09:56 -07:00
Ava Chow
bd158ab4e3
Merge bitcoin/bitcoin#32023: wallet: removed duplicate call to GetDescriptorScriptPubKeyMan
55b931934a removed duplicate calling of GetDescriptorScriptPubKeyMan (Saikiran)

Pull request description:

  Removed the duplicate call to GetDescriptorScriptPubKeyMan.
  Instead of checking linearly, the find method on the map is used, so the time complexity of GetDescriptorScriptPubKeyMan is reduced significantly.
  This fix improves the performance of importdescriptors; refs https://github.com/bitcoin/bitcoin/issues/32013.

  **Steps to reproduce in a testnet environment**

  **Input size:** 2 million addresses in the wallet

  **Step 1:** call the importdescriptors RPC method and
  observe the time it takes.

  **With the provided fix:**
  Do the same steps again and
  observe the time it takes.

  There is a huge improvement in performance (previously it could take 5 to 6 seconds; now it takes 1 second or less).

  Main changes made in this PR:

  1. Remove the duplicate call to the GetDescriptorScriptPubKeyMan method.
  2. Inside GetDescriptorScriptPubKeyMan, **each address was previously checked linearly**, calling the HasWallet method (which acquires a lock) on every iteration.
  3. The logic now calls the **find method on the map (O(log n))**, so HasWallet is called only once.

  **Note:** With smaller wallets you may not see the issue, but with a huge wallet it definitely impacts performance.
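
  A hedged sketch of the lookup change (simplified types, not the wallet's actual ScriptPubKeyMan map):
  ```cpp
  #include <map>
  #include <string>

  struct SPKMan { std::string descriptor; };

  // Before (roughly): a linear scan, with a lock-acquiring HasWallet-style
  // check performed on every iteration.
  const SPKMan* FindLinear(const std::map<int, SPKMan>& spk_mans, const std::string& desc)
  {
      for (const auto& [id, spkm] : spk_mans) {
          // expensive per-iteration wallet check lived here in the old code path
          if (spkm.descriptor == desc) return &spkm;
      }
      return nullptr;
  }

  // After (roughly): an O(log n) map lookup by key, with a single wallet check.
  const SPKMan* FindById(const std::map<int, SPKMan>& spk_mans, int id)
  {
      const auto it = spk_mans.find(id);
      return it == spk_mans.end() ? nullptr : &it->second;
  }
  ```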

ACKs for top commit:
  achow101:
    ACK 55b931934a
  w0xlt:
    ACK 55b931934a

Tree-SHA512: 4a7fdbcbb4e55bd034e9cf28ab4e7ee3fb1745fc8847adb388c98a19c952a1fb66d7b54f0f28b4c2a75a42473923742b4a99fb26771577183a98e0bcbf87a8ca
2025-04-23 13:51:48 -07:00
Ava Chow
17bb63f9f9 wallet: Disallow loading legacy wallets
Legacy wallets do not have the descriptors flag set. Don't load wallets
without the descriptors flag.

At the same time, we will no longer load BDB databases since they are
only used for legacy wallets.
2025-04-23 12:11:56 -07:00
Ava Chow
9f04e02ffa wallet: Disallow creating legacy wallets
Remove the option to set descriptors=False when creating a wallet, and
enforce this in RPC and in CreateWallet
2025-04-23 12:11:56 -07:00
Ava Chow
6b247279b7 wallet: Disallow legacy wallet creation from the wallet tool 2025-04-23 12:10:30 -07:00
Ava Chow
5e93b1fd6c bench: Remove WalletLoadingLegacy benchmark 2025-04-23 12:10:30 -07:00
Ava Chow
56f959d829 wallet: Remove wallettool salvage
Salvage is bdb-only, and bdb is about to be removed.
2025-04-23 12:10:30 -07:00
Ava Chow
7a41c939f0 wallet: Remove -format and bdb from wallet tool's createfromdump 2025-04-23 12:10:30 -07:00
Ava Chow
c847dee148 test: remove legacy wallet functional tests
Removes all legacy wallet specific functional tests.

Also removes the --descriptor and --legacy-wallet options as these are
no longer necessary with the legacy wallet removed.
2025-04-23 12:10:30 -07:00
Ava Chow
20a9173717 test: Remove legacy wallet tests from wallet_reindex.py 2025-04-23 12:10:30 -07:00
Ava Chow
446d480cb2 test: Remove legacy wallet tests from wallet_backwards_compatibility.py 2025-04-23 12:10:30 -07:00
Ava Chow
aff80298d0 test: wallet_signer.py bdb will be removed 2025-04-23 12:09:38 -07:00
Ava Chow
f94f9399ac test: Remove legacy wallet unit tests 2025-04-23 12:09:38 -07:00
Ava Chow
d9ac9dbd8e tests, gui: Use descriptors watchonly wallet for watchonly test 2025-04-23 12:09:38 -07:00
Hennadii Stepanov
9a4c92eb9a
Merge bitcoin/bitcoin#32226: ci: switch to LLVM 20 in tidy job
08aa7fe232 ci: clang-tidy 20 (fanquake)
2b85d31bcc refactor: starts/ends_with changes for clang-tidy 20 (fanquake)

Pull request description:

  Switch to LLVM 20 in the tidy job.

ACKs for top commit:
  l0rinc:
    ACK 08aa7fe232
  hebasto:
    ACK 08aa7fe232.

Tree-SHA512: 54b6c64adcf7556edf3b30f87935de7868354e8ad252da834796f347a5a77feda01f145f17e5a7419cf6f3b4f87fc2b168c1ec2a2d13bb4e0ffcc0fac667fd42
2025-04-23 13:35:43 +01:00
merge-script
82d1e94838
Merge bitcoin/bitcoin#32310: test: Run all benchmarks in the sanity check
faca46b042 test: Run all benchmarks in the sanity check (MarcoFalke)

Pull request description:

  It is unclear why not all benchmarks are run, given that:

  * they only run as a sanity check (fastest version)
  * no one otherwise runs them, not even CI
  * issues have been missed due to this

ACKs for top commit:
  l0rinc:
    ACK faca46b042
  BrandonOdiwuor:
    Code Review ACK faca46b042

Tree-SHA512: 866f1ccff0313017dd313d5a218d7ee088b823601a129b9ed4c5819b0d57fd808d78e3ea28ca00714ae6b209df5312b7b9dea091b2b028821ff46b8ba263c48a
2025-04-23 10:16:05 +01:00
Ryan Ofsky
dda2d4e176
Merge bitcoin/bitcoin#32113: fuzz: enable running fuzz test cases in Debug mode
3669ecd4cc doc: Document fuzz build options (Anthony Towns)
c1d01f59ac fuzz: enable running fuzz test cases in Debug mode (Anthony Towns)

Pull request description:

  When building with

      BUILD_FOR_FUZZING=OFF
      BUILD_FUZZ_BINARY=ON
      CMAKE_BUILD_TYPE=Debug

  allow the fuzz binary to execute given test cases (without actual fuzzing) to make it easier to reproduce fuzz test failures in a more normal debug build.

  In Debug builds, deterministic fuzz behaviour is controlled via a runtime variable, which is normally false, but set to true automatically in the fuzz binary, unless the FUZZ_NONDETERMINISM environment variable is set.
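
  A hedged sketch of that runtime switch (illustrative names; only the `FUZZ_NONDETERMINISM` variable is taken from the description above):
  ```cpp
  #include <cstdlib>

  // Illustrative global: false by default in Debug builds, so ordinary unit
  // and bench binaries are unaffected.
  bool g_deterministic_fuzz{false};

  // Called from the fuzz binary's startup: force determinism unless the
  // FUZZ_NONDETERMINISM environment variable is set.
  void InitFuzzDeterminism()
  {
      g_deterministic_fuzz = (std::getenv("FUZZ_NONDETERMINISM") == nullptr);
  }
  ```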

ACKs for top commit:
  maflcko:
    re-ACK 3669ecd4cc 🏉
  marcofleon:
    re ACK 3669ecd4cc
  ryanofsky:
    Code review ACK 3669ecd4cc with just variable renamed and documentation added since last review

Tree-SHA512: 5da5736462f98437d0aa1bd01aeacb9d46a9cc446a748080291067f7a27854c89f560f3a6481b760b9a0ea15a8d3ad90cd329ee2a008e5e347a101ed2516449e
2025-04-22 22:00:59 -04:00
MarcoFalke
faca46b042
test: Run all benchmarks in the sanity check 2025-04-22 19:07:18 +02:00
Hennadii Stepanov
e5a00b2497
Merge bitcoin/bitcoin#32309: bench: close wallets after migration
cad39f86fb bench: ensure wallet migration benchmark runs exactly once (Lőrinc)
c1f458aaa0 ci: re-enable all benchmark runs (Lőrinc)
1da11dbc44 bench: clean up migrated descriptor wallets via loader teardown (Lőrinc)

Pull request description:

  The low-priority `WalletMigration` benchmark existed for some time but was never run automatically in our CI.

  Although the failure first surfaced on Windows as a hang during temporary directory cleanup, it could also be reproduced on Linux and macOS when forcing multiple iterations (e.g. via a long `--min-time`).

  ### Root causes

  1. **Leaked open wallets on Windows**
     `MigrateLegacyToDescriptor` produces two new descriptor wallets (the primary spendable wallet and a companion watch‑only wallet). Without unloading them, their database files remained open in the `WalletContext`, blocking directory removal and hanging the test harness.

  <details><summary>Details</summary>

  ```bash
  what():  filesystem error: cannot remove all: The process cannot access the file because it is being used by another process [C:\Users\RUNNER\~1\AppData\Local\Temp\test_common bitcoin\WalletMigration\d8ffd89a7700ce01c31f] [C:\Users\RUNNER~1\AppData\Local\Temp\test_common bitcoin\WalletMigration\d8ffd89a7700ce01c31f\regtest\wallet.dat]
  ```

  </details>

  2. **Undefined behavior on repeated runs**
     The benchmark body calls `std::move(wallet)`, invalidating the local `wallet` pointer. Running more than one iteration causes a use-after-move, which the sanitizers flag.

  <details><summary>Details</summary>

  ```bash
  error: bench_bitcoin 0x00067927: DW_TAG_member '_M_local_buf' refers to type 0x00000000000b3ba7 which extends beyond the bounds of 0x0006791d
  * thread #1, name = 'b-test', stop reason = signal SIGSEGV: address not mapped to object (fault address: 0xc8)
    * frame #0: 0x00005555556a3f33 bench_bitcoin`... basic_string<char>::length(this=<unavailable>) const at basic_string.h:1079:16
  ```

  </details>

  ### Fixes

  - **Automatic wallet teardown**
    Wrap the benchmark in a `MakeWalletLoader` (owning a `WalletContext`), so that both migrated wallets are unloaded when the loader goes out of scope, eliminating any lingering open files.

  - **Re-enable benchmarks in CI**
    Drop the temporary filter in GitHub Actions. The `-sanity-check` run already executes each benchmark once, so `WalletMigration` now runs automatically without hangs or crashes.

  - **Single iteration**
    Configure the microbenchmark with `.epochs(1).epochIterations(1)`, ensuring the migration code runs exactly once and avoiding use-after-move.

  No measurable change in benchmark performance.
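
  A hedged sketch of the single-iteration fix (using the ankerl::nanobench API the benchmark suite builds on; the wallet object here is a stand-in, not the real migration code):
  ```cpp
  #define ANKERL_NANOBENCH_IMPLEMENT
  #include <nanobench.h>

  #include <memory>
  #include <string>

  // Stand-in for the migration call that consumes the wallet by value.
  static void ConsumeWallet(std::unique_ptr<std::string> wallet) { (void)wallet->size(); }

  int main()
  {
      auto wallet = std::make_unique<std::string>("legacy wallet");
      ankerl::nanobench::Bench{}
          .epochs(1)
          .epochIterations(1) // run the body exactly once: a second run would
                              // dereference the moved-from (null) unique_ptr
          .run("WalletMigration", [&] { ConsumeWallet(std::move(wallet)); });
  }
  ```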

ACKs for top commit:
  maflcko:
    review ACK cad39f86fb 🍥
  furszy:
    utACK cad39f86fb
  hebasto:
    ACK cad39f86fb, tested on Ubuntu 25.04.

Tree-SHA512: 10343ce7ab9b63ba4f51a7673018215577ea7ec188e41d535a66d69d73b85bca6ba301c33f6920c02f8f7d686c75c65c4a4e9bdafb04b60be85d66aa743cfa20
2025-04-22 17:42:17 +01:00
merge-script
8406a9f4f1
Merge bitcoin/bitcoin#32325: ci: Add missing -Wno-error=array-bounds to valgrind fuzz
fa653cb416 ci: Add missing -Wno-error=array-bounds to valgrind fuzz (MarcoFalke)

Pull request description:

  Due to an upstream GCC issue, any debug/fuzz build which aborts on failed assumes will print a false positive array-bounds warning in `src/test/fuzz/txgraph.cpp`.

  This also affects one CI task.

  Fix the CI task by ignoring the error for now.

  Fixes https://github.com/bitcoin/bitcoin/issues/32276

ACKs for top commit:
  fanquake:
    ACK fa653cb416 - checked native fuzz

Tree-SHA512: 0f6c5ec8d96e0bf96cd008e2de5db59e528086a67dcb77f3e59a0d83225d880a59e960d65c5bc8b5ae3de9d5d301bfc7737d95c282aa1bcc740a42561f610ca8
2025-04-22 17:02:38 +01:00
MarcoFalke
fadf12a56c
test: Add missing check for empty stderr in util tester 2025-04-22 17:54:38 +02:00
MarcoFalke
fa653cb416
ci: Add missing -Wno-error=array-bounds to valgrind fuzz 2025-04-22 16:20:47 +02:00
fanquake
08aa7fe232
ci: clang-tidy 20 2025-04-22 13:16:54 +01:00
fanquake
2b85d31bcc
refactor: starts/ends_with changes for clang-tidy 20 2025-04-22 13:16:54 +01:00
merge-script
96a5cd8000
Merge bitcoin/bitcoin#32293: doc: Add deps install notes for multiprocess
7f5a35cf4b doc: Add deps install notes for multiprocess (TheCharlatan)

Pull request description:

  These just mirror the content in src/ipc/libmultiprocess/doc/install.md

ACKs for top commit:
  Sjors:
    re-ACK 7f5a35cf4b
  ryanofsky:
    Code review ACK 7f5a35cf4b just dropping dependencies.md update since last review

Tree-SHA512: f9bf4f54542323aa4a0600db874640e575e40355f08515331a27fb139e6e47ee58aa0c6635206f978696e3da7b5aa93efb45b181b02e99e308537fcb90bd6751
2025-04-22 13:16:08 +01:00
merge-script
2844adc8ba
Merge bitcoin/bitcoin#32308: ci: Drop no longer necessary -Wno-error=array-bounds
e34f12bdd4 ci: Drop no longer necessary `-Wno-error=array-bounds` (Hennadii Stepanov)

Pull request description:

  The build log of the "Linux->Windows cross" CI job no longer shows any `-Warray-bounds` compiler warnings. Therefore, there's no need to suppress them with `-Wno-error=array-bounds`.

  I likely overlooked this when reviewing https://github.com/bitcoin/bitcoin/pull/29881, as I can run that CI job locally without such warnings even at commit 785649f397.

ACKs for top commit:
  TheCharlatan:
    ACK e34f12bdd4

Tree-SHA512: ac66160866097538af6f196c0cb22d370427c59c071b0ddcb1a6717e233bbd3dfed4e090d266221c55ae0ddd3d5dffb0ca7ae01582eda07f25fb886a775b6ac5
2025-04-22 13:13:11 +01:00
Lőrinc
cad39f86fb bench: ensure wallet migration benchmark runs exactly once
The migration benchmark crashes if run more than once, because `std::move(wallet)` leaves subsequent iterations in an undefined state. Running it exactly once avoids the `UndefinedBehaviorSanitizer` null-dereference error.
2025-04-22 12:50:26 +02:00
Lőrinc
c1f458aaa0 ci: re-enable all benchmark runs
Drop the temporary `-filter` that excluded the `WalletMigration` benchmark.
2025-04-22 12:49:53 +02:00
Lőrinc
1da11dbc44 bench: clean up migrated descriptor wallets via loader teardown
`MigrateLegacyToDescriptor` returns both a spendable descriptor wallet and a watch‑only wallet.
If these remain attached, their files stay open and on Windows this can hang CI when removing the test directory.

By constructing them via `MakeWalletLoader` (which owns the `WalletContext`), both wallets are automatically unloaded when the loader is destroyed at the end.
This ensures no lingering handles or resource leaks when running the benchmark on CI with `-sanity-check`.

Co-authored-by: furszy <matiasfurszyfer@protonmail.com>
2025-04-22 12:41:04 +02:00
Anthony Towns
3669ecd4cc doc: Document fuzz build options
Co-Authored-By: Ryan Ofsky <ryan@ofsky.org>
2025-04-22 17:11:24 +10:00
Anthony Towns
c1d01f59ac fuzz: enable running fuzz test cases in Debug mode
When building with

 BUILD_FOR_FUZZING=OFF
 BUILD_FUZZ_BINARY=ON
 CMAKE_BUILD_TYPE=Debug

allow the fuzz binary to execute given test cases (without actual
fuzzing) to make it easier to reproduce fuzz test failures in a more
normal debug build.

In Debug builds, deterministic fuzz behaviour is controlled via a runtime
variable, which is normally false, but set to true automatically in the
fuzz binary, unless the FUZZ_NONDETERMINISM environment variable is set.
2025-04-22 17:11:24 +10:00
Ava Chow
06439a14c8
Merge bitcoin/bitcoin#31953: rpc: Allow fullrbf fee bump in (psbt)bumpfee
fa86190e6e rpc: Allow fullrbf fee bump (MarcoFalke)

Pull request description:

  The RPCs (psbt)bumpfee, and the GUI, reject fee bumps when BIP 125 signalling is absent in the transaction even when the mempool and other RPCs allow them. Fix the confusion by allowing the fee bump.

  This is done after fullrbf is always on (https://github.com/bitcoin/bitcoin/pull/30592)

ACKs for top commit:
  1440000bytes:
    reACK fa86190e6e
  achow101:
    ACK fa86190e6e
  w0xlt:
    ACK fa86190e6e
  rkrux:
    reACK fa86190e6e
  glozow:
    ACK fa86190e6e

Tree-SHA512: b2ffe8dcadbe71e9be767a16cf8aa0bf383c2de7aa1aee9438d125f444e24f3f7e4f02ddb28981bd3b8b645b6a24a407b4ad6bb0b21946ae637e78f6386e05bf
2025-04-21 13:25:52 -07:00
merge-script
3e78ac6811
Merge bitcoin/bitcoin#31243: descriptor: Move filling of keys from DescriptorImpl::MakeScripts to PubkeyProvider::GetPubKey
acee5c59e6 descriptors: Have GetPrivKey fill keys directly (Ava Chow)
4b0303197e descriptors: Move FlatSigningProvider pubkey filling to GetPubKey (Ava Chow)
25a3b9b0f5 descriptors: Have GetPubKey fill origins directly (Ava Chow)
6268bde0af descriptor: Remove unused parent_info from BIP32PUbKeyProvider::GetPubKey (Ava Chow)
0ff072caa1 wallet, rpc: Only allow keypool import from single key descriptors (Ava Chow)

Pull request description:

  Instead of having `MakeScripts` infer what pubkeys need to go into the output `FlatSigningProvider`, have each of the `PubkeyProviders` that have `GetPubKey` and `GetPrivKey` called fill it directly with relevant keys and origins.

  This allows for keys and origins to be added that won't directly appear in the output, which is necessary for `musig()` descriptors.
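
  A hedged sketch of the interface shift (simplified types, not the real descriptor classes):
  ```cpp
  #include <map>
  #include <optional>
  #include <string>

  struct FlatSigningProviderLike { std::map<int, std::string> pubkeys; };

  struct PubkeyProviderLike {
      int id{0};
      std::string key{"02ab..."};

      // Before (roughly): the caller received the key and filled the
      // signing provider itself, so only keys appearing in the output
      // script ever ended up in it.
      std::optional<std::string> GetPubKeyOld() const { return key; }

      // After (roughly): the provider is passed in and filled here, so it
      // can also record keys (e.g. musig() participants) that never appear
      // directly in the resulting script.
      std::optional<std::string> GetPubKey(FlatSigningProviderLike& out) const
      {
          out.pubkeys.emplace(id, key);
          return key;
      }
  };
  ```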

  Split from #29675

ACKs for top commit:
  fjahr:
    Code review ACK acee5c59e6
  theStack:
    re-ACK acee5c59e6
  rkrux:
    ACK acee5c5

Tree-SHA512: c1841359bcb08cdd433122deef96579236928660785f3357a3eb584e47d290cd1c60ebe8f7fba50f178ba45c9a90773124e0f509e36c5a0df97c1a4890e03e5c
2025-04-21 14:53:55 -04:00
merge-script
728e86e3f3
Merge bitcoin/bitcoin#31640: tests: improves tapscript unit tests
e3d7533ac9 test: improves tapscript unit tests (Ethan Heilman)
3e167085ba test: Ensures test fails if witness is not hex (Ethan Heilman)

Pull request description:

  This commit creates new test utilities for future Taproot script tests within script_tests.json. The key features of this commit are the addition of three new tags: `#SCRIPT#`, `#CONTROLBLOCK#`, and `#TAPROOTOUTPUT#`. These tags streamline the test creation process by eliminating the need to manually generate these components outside the test suite.

  * `#SCRIPT#`: Parses Tapscript and outputs a byte string of opcodes.
  * `#CONTROLBLOCK#`: Automatically generates the control block for a given Taproot output.
  * `#TAPROOTOUTPUT#`: Generates the final Taproot scriptPubKey.

  This code was originally part of the OP_CAT PR https://github.com/bitcoin/bitcoin/pull/29247 but was pulled out into a separate PR to reduce the rebase treadmill for the OP_CAT PR.

  Additionally this PR adds a check to ensure that if the witness data can not be parsed as hex the test fails. Prior to this PR, the test code would fail silently and set the values it couldn't parse as empty stack elements. This fix was suggested by @instagibbs.

  ## Rationale

  While writing JSON script tests (script_tests.json) for https://github.com/bitcoin/bitcoin/pull/29247 we ran into the following problem. The JSON script tests are simple and easy to write for pre-Tapscript scripts, but adding or changing a Tapscript test requires substantial work per test. Consider the following pre-tapscript test:

  ```
  ["'aa' 'bb'", "CAT 0x4c 0x02 0xaabb EQUAL", "P2SH,STRICTENC", "DISABLED_OPCODE", "CAT disabled"]
  ```

  whereas a Tapscript test for the same script (annotated with comments for better readability) would look like:

  ```
  [
      [
          "aa",
          "bb",
          "7e4c02aabb87", // output script
          "c0d6889cb081036e0faefa3a35157ad71086b123b2b144b649798b494c300a961d", // control block
          0.00000001
      ],
      "",
      "0x51 0x20 0x15048ed3a65748549c27b671936987093cf73a4c9cb18522a74fb9553060ca99", // Tapscript output
      "P2SH,WITNESS,TAPROOT",
      "OK",
      "TAPSCRIPT CATs aa and bb together and checks if EQUAL to aabb"
  ]
  ```

  Computing the Tapscript output, such as `0x51 0x20 0x15048ed3a65748549c27b671936987093cf73a4c9cb18522a74fb9553060ca99`, requires writing custom code and running it for each test. The same is true for the Tapscript control block, such as `c0d6889cb081036e0faefa3a35157ad71086b123b2b144b649798b494c300a961d`. If a test is changed or updated new outputs and control blocks must be computed. The complexity of doing this is likely the reason that no one has added any Tapscript tests to JSON script tests until this PR.

  In this PR we address this issue by adding the following improvements to JSON script tests:

  Adding simple macros ("#SCRIPT# and #CONTROLBLOCK#) that allow the script test parser to automatically generate and inject a valid Tapscript output and control block to be computed automatically from the JSON script.
  Allowing Tapscript scripts to use the human readable strings like pre-script scripts by marking the location of the script in the witness stack using #SCRIPT#. This transforms the unreadable script 7e4c02aabb87 into #SCRIPT# CAT 0x4c 0x02 0xaabb EQUAL.
  This results in the following JSON script test which is far easier to write and easier to read.

  ```
  [
      [
          "aa",
          "bb",
          "#SCRIPT# CAT",
          "#CONTROLBLOCK#",
          0.00000001
      ],
      "",
      "0x51 0x20 #TAPROOTOUTPUT#",
      "P2SH,WITNESS,TAPROOT,OP_CAT",
      "OK",
      "TAPSCRIPT Test of OP_CAT flag by calling CAT on two elements. TAPSCRIPT_OP_CAT flag is set so CAT is executed."
  ],
  ```

ACKs for top commit:
  instagibbs:
    reACK e3d7533ac9
  sipa:
    utACK e3d7533ac9
  janb84:
    Re ACK [e3d7533](e3d7533ac9)

Tree-SHA512: 948c3ec28a4b2b222c2d77e48918ed19d298b51d64662fc20959073edd9978fc796516a392da9755a7e173f556e3021816dc6ce8eb3ed16bbe0fa6ebc574fd48
2025-04-21 14:50:48 -04:00
Ethan Heilman
e3d7533ac9 test: improves tapscript unit tests
This commit creates new test utilities for future Taproot script
tests within script_tests.json. The key features of this commit are the
addition of three new tags: `#SCRIPT#`, `#CONTROLBLOCK#`, and
`#TAPROOTOUTPUT#`. These tags streamline the test creation process by
eliminating the need to manually generate these components outside the
test suite.

* `#SCRIPT#`: Parses Tapscript and outputs a byte string of opcodes.
* `#CONTROLBLOCK#`: Automatically generates the control block for a given
Taproot output.
* `#TAPROOTOUTPUT#`: Generates the final Taproot scriptPubKey.

Update src/test/script_tests.cpp

Co-authored-by: Jan B <608446+janb84@users.noreply.github.com>
2025-04-21 11:38:15 -04:00
Hennadii Stepanov
e34f12bdd4
ci: Drop no longer necessary -Wno-error=array-bounds 2025-04-19 08:11:47 +01:00
TheCharlatan
7f5a35cf4b
doc: Add deps install notes for multiprocess
These just mirror the content in src/ipc/libmultiprocess/doc/install.md
2025-04-17 20:26:43 +02:00
MarcoFalke
fa86190e6e
rpc: Allow fullrbf fee bump
Also, fix the incorrect documentation of the 'replaceable' RPC argument
with respect to sequence number handling. The docs were incorrect
before, so the fix could be extracted, but it seems fine to include here
as well.
2025-04-17 13:12:26 +02:00
Hennadii Stepanov
513e2020a9
guix: Remove unused file package
The `file` utility has not been required since Guix builds were
introduced.
2025-04-17 10:33:01 +01:00
Ava Chow
acee5c59e6 descriptors: Have GetPrivKey fill keys directly
Instead of GetPrivKey returning a key and having the caller fill the
FlatSigningProvider, have GetPrivKey take the FlatSigningProvider and
fill it by itself. This will be necessary for descriptors such as
musig() where there are private keys that need to be added to the
FlatSigningProvider but do not directly appear in any resulting scripts.

GetPrivKey is now changed to void as the caller no longer cares whether
it succeeds or fails.
2025-04-14 16:32:01 -07:00
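As a rough illustration of the new calling pattern described in this commit (a sketch with hypothetical stand-in types, not the real descriptor classes), the key-derivation call now fills the provided signing-provider object directly instead of returning a key:

```cpp
#include <map>
#include <string>

// Hypothetical stand-ins for CKeyID/CKey/FlatSigningProvider, for illustration only.
using KeyIdSketch = std::string;
using KeySketch = std::string;

struct SigningProviderSketch {
    std::map<KeyIdSketch, KeySketch> keys;
};

// Old shape (illustrative): bool GetPrivKey(int pos, ..., KeySketch& key_out);
// New shape described above: fill the provider directly and return nothing,
// since the caller no longer cares whether a key was actually produced.
void GetPrivKeySketch(int pos, SigningProviderSketch& out)
{
    // A provider with no private key available would simply add nothing here.
    out.keys.emplace("keyid-" + std::to_string(pos), "privkey-" + std::to_string(pos));
}

int main()
{
    SigningProviderSketch out;
    GetPrivKeySketch(0, out);
    return out.keys.size() == 1 ? 0 : 1;
}
```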
Ava Chow
4b0303197e descriptors: Move FlatSigningProvider pubkey filling to GetPubKey
Instead of MakeScripts inconsistently filling the output
FlatSigningProvider with the pubkeys involved, just do it in GetPubKey.
2025-04-14 16:32:01 -07:00
Ava Chow
25a3b9b0f5 descriptors: Have GetPubKey fill origins directly
Instead of having ExpandHelper fill in the origins in the
FlatSigningProvider output, have GetPubKey do it by itself. This reduces
the extra variables needed in order to track and set origins in
ExpandHelper.

Also changes GetPubKey to return a std::optional<CPubKey> rather than
using a bool and output parameters.
2025-04-14 16:32:01 -07:00
Ava Chow
6268bde0af descriptor: Remove unused parent_info from BIP32PUbKeyProvider::GetPubKey 2025-04-14 16:32:01 -07:00
Ava Chow
0ff072caa1 wallet, rpc: Only allow keypool import from single key descriptors
Legacy wallets should only import keys to the keypool if they came in a
single key descriptor. Instead of relying on assumptions about the
descriptor based on how many pubkeys show up after expanding the
descriptor, explicitly mark descriptors as being single key type and use
that for the check.
2025-04-14 16:32:01 -07:00
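A minimal sketch of the check described above, assuming hypothetical stand-in types rather than the actual wallet code; the decision keys off the descriptor's own single-key flag (mirroring the `IsSingleKey()` method added elsewhere in this changeset) instead of counting expanded pubkeys:

```cpp
// Hypothetical descriptor interface exposing only the flag relevant here.
struct DescriptorSketch {
    virtual ~DescriptorSketch() = default;
    virtual bool IsSingleKey() const = 0;
};

struct PkhDescriptorSketch final : DescriptorSketch {
    bool IsSingleKey() const override { return true; }   // e.g. pkh(KEY)
};

struct MultisigDescriptorSketch final : DescriptorSketch {
    bool IsSingleKey() const override { return false; }  // e.g. multi(2, A, B)
};

// Only single-key descriptors are allowed to feed the legacy keypool.
bool AllowKeypoolImport(const DescriptorSketch& desc)
{
    return desc.IsSingleKey();
}
```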
Hennadii Stepanov
d0cce4172c depends: Fix mv command compatibility with macOS 2025-04-03 19:03:52 +01:00
Hennadii Stepanov
690f5da15a depends: Specify Objective C/C++ compilers for native_qt package
This change fixes cross-compilation from macOS to macOS with another
architecture.
2025-04-03 19:03:27 +01:00
Matt Corallo
3c3548a70e validation: clarify final |= BLOCK_FAILED_VALID in InvalidateBlock
This has no functional effect, as any CBlockIndex*s which
to_mark_failed is set to will already have been marked as failed.

Also prevents a situation where a block already marked as
BLOCK_FAILED_CHILD is again unconditionally marked as
BLOCK_FAILED_VALID in the final |= BLOCK_FAILED_VALID.
2025-03-31 08:37:07 +05:30
stratospher
aac5488909 validation: correctly update BlockStatus for invalid block descendants
invalid_block ----------> block_index

- before this commit, block_index was marked as BLOCK_FAILED_CHILD
  only if it was not already invalid
- it's possible that the block_index encountered is already invalid
  and was previously marked as BLOCK_FAILED_VALID
- in this case, correctly update the BlockStatus of block_index by
  clearing BLOCK_FAILED_VALID and then setting BLOCK_FAILED_CHILD
  (sketched below)
2025-03-31 08:37:07 +05:30
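The bullet points above boil down to a single bit-flip on the block's status; a compact sketch follows, with illustrative constants standing in for the real `BlockStatus` flags in `chain.h`:

```cpp
#include <cstdint>

// Illustrative stand-ins for the flags referenced above; the real values live
// in chain.h as part of the BlockStatus enum.
constexpr uint32_t BLOCK_FAILED_VALID_SKETCH{1u << 5};
constexpr uint32_t BLOCK_FAILED_CHILD_SKETCH{1u << 6};

struct BlockIndexSketch {
    uint32_t nStatus{0};
};

// The update described above: even if this descendant was previously marked
// BLOCK_FAILED_VALID, clear that bit and mark it BLOCK_FAILED_CHILD instead.
void MarkAsFailedChild(BlockIndexSketch& block_index)
{
    block_index.nStatus = (block_index.nStatus & ~BLOCK_FAILED_VALID_SKETCH) | BLOCK_FAILED_CHILD_SKETCH;
}
```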
stratospher
9e29653b42 test: check BlockStatus when InvalidateBlock is used
when a block is invalidated using InvalidateBlock, check that:
1. its status is BLOCK_FAILED_VALID
2. its children's status is BLOCK_FAILED_CHILD
   and not BLOCK_FAILED_VALID
3. its ancestors are valid
2025-03-31 08:37:00 +05:30
stratospher
c99667583d validation: fix traversal condition to mark BLOCK_FAILED_CHILD
this block of code is not reached on master since, other than at
initialisation, all other iterations have the invalid_walk_tip
and to_mark_failed pointers in some form of this layout,
where 1, 2, 3 and 4 are block heights.

	invalid_walk_tip
	  ↓
1 <- 2 <- 3 <- 4
	       ↑
	      to_mark_failed

fix it so that a block is correctly marked as BLOCK_FAILED_CHILD
if it is a descendant of a BLOCK_FAILED_VALID block.
2025-03-31 08:26:37 +05:30
Ethan Heilman
3e167085ba test: Ensures test fails if witness is not hex
This commit ensures that we do not fail silently when the test script encounters a witness string in the JSON test data that cannot be parsed as hex.
2025-03-29 15:39:02 -04:00
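A minimal sketch of the stricter behaviour described above, using only the standard library rather than the actual script_tests.cpp helpers:

```cpp
#include <cctype>
#include <stdexcept>
#include <string>

// Reject a witness string outright if it is not valid hex, instead of
// silently treating it as empty data.
std::string RequireHexWitness(const std::string& witness)
{
    if (witness.size() % 2 != 0) {
        throw std::runtime_error("witness is not hex (odd length): " + witness);
    }
    for (const unsigned char c : witness) {
        if (!std::isxdigit(c)) {
            throw std::runtime_error("witness is not hex: " + witness);
        }
    }
    return witness;
}
```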
David Gumberg
77e553ab6a build: refactor: hardening flags -> core_interface 2025-03-24 13:38:12 -07:00
David Gumberg
00ba3ba303 build: Drop option for disabling hardening
Building unhardened executables is not a supported use case that should
be maintained, and those who want unhardened executables can still
override the hardening flags by appending disable flags.

For example:

cmake -B build -DAPPEND_CPPFLAGS='-U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=0 -fno-stack-protector -fcf-protection=none -fno-stack-clash-protection' -DAPPEND_LDFLAGS='-Wl,-z,lazy -Wl,-z,norelro -Wl,-z,noseparate-code'
2025-03-24 13:37:49 -07:00
David Gumberg
f57db75e91 build: Use -z noseparate-code on NetBSD < 11.0
This can be dropped once Bitcoin Core no longer supports NetBSD 10.0 or
if the upstream fix is backported.

NetBSD's dynamic linker ld.elf_so < 11.0 supports exactly 2 `PT_LOAD`
segments, and binaries linked with `-z separate-code` have 4 `PT_LOAD`
segments.

https://github.com/bitcoin/bitcoin/pull/28724#issuecomment-2589347934
https://mail-index.netbsd.org/tech-userlevel/2023/01/05/msg013666.html
2025-03-24 13:37:10 -07:00
Saikiran
55b931934a removed duplicate calling of GetDescriptorScriptPubKeyMan
Removed a duplicate call to GetDescriptorScriptPubKeyMan and, instead of
checking linearly, used the map's find method, which significantly reduces
the time complexity of GetDescriptorScriptPubKeyMan and improves the
performance of the importdescriptors code path. Refs #32013.
2025-03-24 17:27:27 +05:30
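A rough sketch of the lookup change described in this commit, with hypothetical container and key types standing in for the real wallet members:

```cpp
#include <map>
#include <memory>
#include <string>

struct ScriptPubKeyManSketch {
    std::string id;
};

using SpkmMapSketch = std::map<std::string, std::unique_ptr<ScriptPubKeyManSketch>>;

// Before: iterate over every entry and compare ids (a linear scan).
// After (as described above): let the map perform the lookup directly.
ScriptPubKeyManSketch* GetDescriptorSpkmSketch(const SpkmMapSketch& spk_managers,
                                               const std::string& id)
{
    const auto it = spk_managers.find(id);
    return it == spk_managers.end() ? nullptr : it->second.get();
}
```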
177 changed files with 1331 additions and 5976 deletions

View file

@ -358,8 +358,7 @@ jobs:
./src/univalue/unitester.exe
- name: Run benchmarks
# TODO: Fix the `WalletMigration` benchmark and re-enable it.
run: ./bin/bench_bitcoin.exe -sanity-check -filter='^(?!WalletMigration$).+$'
run: ./bin/bench_bitcoin.exe -sanity-check
- name: Adjust paths in test/config.ini
shell: pwsh

View file

@ -128,7 +128,6 @@ if(WITH_BDB)
endif()
cmake_dependent_option(BUILD_WALLET_TOOL "Build bitcoin-wallet tool." ${BUILD_TESTS} "ENABLE_WALLET" OFF)
option(ENABLE_HARDENING "Attempt to harden the resulting executables." ON)
option(REDUCE_EXPORTS "Attempt to reduce exported symbols in the resulting executables." OFF)
option(WERROR "Treat compiler warnings as errors." OFF)
option(WITH_CCACHE "Attempt to use ccache for compiling." ON)
@ -502,63 +501,71 @@ try_append_cxx_flags("-fmacro-prefix-map=A=B" TARGET core_interface SKIP_LINK
# -fstack-reuse=none for all gcc builds. (Only gcc understands this flag).
try_append_cxx_flags("-fstack-reuse=none" TARGET core_interface)
if(ENABLE_HARDENING)
add_library(hardening_interface INTERFACE)
target_link_libraries(core_interface INTERFACE hardening_interface)
if(MSVC)
try_append_linker_flag("/DYNAMICBASE" TARGET hardening_interface)
try_append_linker_flag("/HIGHENTROPYVA" TARGET hardening_interface)
try_append_linker_flag("/NXCOMPAT" TARGET hardening_interface)
else()
if(MSVC)
try_append_linker_flag("/DYNAMICBASE" TARGET core_interface)
try_append_linker_flag("/HIGHENTROPYVA" TARGET core_interface)
try_append_linker_flag("/NXCOMPAT" TARGET core_interface)
else()
# _FORTIFY_SOURCE requires that there is some level of optimization,
# otherwise it does nothing and just creates a compiler warning.
try_append_cxx_flags("-U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=3"
RESULT_VAR cxx_supports_fortify_source
SOURCE "int main() {
# if !defined __OPTIMIZE__ || __OPTIMIZE__ <= 0
#error
#endif
}"
# _FORTIFY_SOURCE requires that there is some level of optimization,
# otherwise it does nothing and just creates a compiler warning.
try_append_cxx_flags("-U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=3"
RESULT_VAR cxx_supports_fortify_source
SOURCE "int main() {
# if !defined __OPTIMIZE__ || __OPTIMIZE__ <= 0
#error
#endif
}"
)
if(cxx_supports_fortify_source)
target_compile_options(core_interface INTERFACE
-U_FORTIFY_SOURCE
-D_FORTIFY_SOURCE=3
)
if(cxx_supports_fortify_source)
target_compile_options(hardening_interface INTERFACE
-U_FORTIFY_SOURCE
-D_FORTIFY_SOURCE=3
)
endif()
unset(cxx_supports_fortify_source)
endif()
unset(cxx_supports_fortify_source)
try_append_cxx_flags("-Wstack-protector" TARGET hardening_interface SKIP_LINK)
try_append_cxx_flags("-fstack-protector-all" TARGET hardening_interface)
try_append_cxx_flags("-fcf-protection=full" TARGET hardening_interface)
try_append_cxx_flags("-Wstack-protector" TARGET core_interface SKIP_LINK)
try_append_cxx_flags("-fstack-protector-all" TARGET core_interface)
try_append_cxx_flags("-fcf-protection=full" TARGET core_interface)
if(MINGW)
# stack-clash-protection is a no-op for Windows.
# See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=90458 for more details.
else()
try_append_cxx_flags("-fstack-clash-protection" TARGET hardening_interface)
endif()
if(MINGW)
# stack-clash-protection is a no-op for Windows.
# See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=90458 for more details.
else()
try_append_cxx_flags("-fstack-clash-protection" TARGET core_interface)
endif()
if(CMAKE_SYSTEM_PROCESSOR STREQUAL "aarch64" OR CMAKE_SYSTEM_PROCESSOR STREQUAL "arm64")
if(CMAKE_SYSTEM_NAME STREQUAL "Darwin")
try_append_cxx_flags("-mbranch-protection=bti" TARGET hardening_interface SKIP_LINK)
else()
try_append_cxx_flags("-mbranch-protection=standard" TARGET hardening_interface SKIP_LINK)
endif()
endif()
try_append_linker_flag("-Wl,--enable-reloc-section" TARGET hardening_interface)
try_append_linker_flag("-Wl,--dynamicbase" TARGET hardening_interface)
try_append_linker_flag("-Wl,--nxcompat" TARGET hardening_interface)
try_append_linker_flag("-Wl,--high-entropy-va" TARGET hardening_interface)
try_append_linker_flag("-Wl,-z,relro" TARGET hardening_interface)
try_append_linker_flag("-Wl,-z,now" TARGET hardening_interface)
try_append_linker_flag("-Wl,-z,separate-code" TARGET hardening_interface)
if(CMAKE_SYSTEM_PROCESSOR STREQUAL "aarch64" OR CMAKE_SYSTEM_PROCESSOR STREQUAL "arm64")
if(CMAKE_SYSTEM_NAME STREQUAL "Darwin")
try_append_linker_flag("-Wl,-fixup_chains" TARGET hardening_interface)
try_append_cxx_flags("-mbranch-protection=bti" TARGET core_interface SKIP_LINK)
else()
try_append_cxx_flags("-mbranch-protection=standard" TARGET core_interface SKIP_LINK)
endif()
endif()
try_append_linker_flag("-Wl,--enable-reloc-section" TARGET core_interface)
try_append_linker_flag("-Wl,--dynamicbase" TARGET core_interface)
try_append_linker_flag("-Wl,--nxcompat" TARGET core_interface)
try_append_linker_flag("-Wl,--high-entropy-va" TARGET core_interface)
try_append_linker_flag("-Wl,-z,relro" TARGET core_interface)
try_append_linker_flag("-Wl,-z,now" TARGET core_interface)
# TODO: This can be dropped once Bitcoin Core no longer supports
# NetBSD 10.0 or if upstream fix is backported.
# NetBSD's dynamic linker ld.elf_so < 11.0 supports exactly 2
# `PT_LOAD` segments and binaries linked with `-z separate-code`
# have 4 `PT_LOAD` segments.
# Relevant discussions:
# - https://github.com/bitcoin/bitcoin/pull/28724#issuecomment-2589347934
# - https://mail-index.netbsd.org/tech-userlevel/2023/01/05/msg013666.html
if(CMAKE_SYSTEM_NAME STREQUAL "NetBSD" AND CMAKE_SYSTEM_VERSION VERSION_LESS 11.0)
try_append_linker_flag("-Wl,-z,noseparate-code" TARGET core_interface)
else()
try_append_linker_flag("-Wl,-z,separate-code" TARGET core_interface)
endif()
if(CMAKE_SYSTEM_NAME STREQUAL "Darwin")
try_append_linker_flag("-Wl,-fixup_chains" TARGET core_interface)
endif()
endif()
if(REDUCE_EXPORTS)
@ -703,7 +710,6 @@ message("Cross compiling ....................... ${cross_status}")
message("C++ compiler .......................... ${CMAKE_CXX_COMPILER_ID} ${CMAKE_CXX_COMPILER_VERSION}, ${CMAKE_CXX_COMPILER}")
include(FlagsSummary)
flags_summary()
message("Attempt to harden executables ......... ${ENABLE_HARDENING}")
message("Treat compiler warnings as errors ..... ${WERROR}")
message("Use ccache for compiling .............. ${WITH_CCACHE}")
message("\n")

View file

@ -77,7 +77,6 @@
"BUILD_UTIL_CHAINSTATE": "ON",
"BUILD_WALLET_TOOL": "ON",
"ENABLE_EXTERNAL_SIGNER": "ON",
"ENABLE_HARDENING": "ON",
"ENABLE_WALLET": "ON",
"WARN_INCOMPATIBLE_BDB": "OFF",
"WITH_BDB": "ON",

View file

@ -17,4 +17,5 @@ export FUZZ_TESTS_CONFIG="--valgrind"
export GOAL="all"
export BITCOIN_CONFIG="\
-DBUILD_FOR_FUZZING=ON \
-DCMAKE_CXX_FLAGS='-Wno-error=array-bounds' \
"

View file

@ -8,7 +8,7 @@ export LC_ALL=C.UTF-8
export CI_IMAGE_NAME_TAG="mirror.gcr.io/ubuntu:24.04"
export CONTAINER_NAME=ci_native_tidy
export TIDY_LLVM_V="19"
export TIDY_LLVM_V="20"
export APT_LLVM_V="${TIDY_LLVM_V}"
export PACKAGES="clang-${TIDY_LLVM_V} libclang-${TIDY_LLVM_V}-dev llvm-${TIDY_LLVM_V}-dev libomp-${TIDY_LLVM_V}-dev clang-tidy-${TIDY_LLVM_V} jq libevent-dev libboost-dev libzmq3-dev systemtap-sdt-dev qt6-base-dev qt6-tools-dev qt6-l10n-tools libqrencode-dev libsqlite3-dev libdb++-dev"
export NO_DEPENDS=1

View file

@ -15,4 +15,4 @@ export RUN_UNIT_TESTS=false
export RUN_FUNCTIONAL_TESTS=false
export GOAL="deploy"
export BITCOIN_CONFIG="-DREDUCE_EXPORTS=ON -DBUILD_GUI_TESTS=OFF \
-DCMAKE_CXX_FLAGS='-Wno-error=maybe-uninitialized -Wno-error=array-bounds'"
-DCMAKE_CXX_FLAGS='-Wno-error=maybe-uninitialized'"

View file

@ -137,8 +137,7 @@ export GUIX_LD_WRAPPER_DISABLE_RPATH=yes
# Make /usr/bin if it doesn't exist
[ -e /usr/bin ] || mkdir -p /usr/bin
# Symlink file and env to a conventional path
[ -e /usr/bin/file ] || ln -s --no-dereference "$(command -v file)" /usr/bin/file
# Symlink env to a conventional path
[ -e /usr/bin/env ] || ln -s --no-dereference "$(command -v env)" /usr/bin/env
# Determine the correct value for -Wl,--dynamic-linker for the current $HOST

View file

@ -6,7 +6,6 @@
(gnu packages commencement)
(gnu packages compression)
(gnu packages cross-base)
(gnu packages file)
(gnu packages gawk)
(gnu packages gcc)
((gnu packages installers) #:select (nsis-x86_64))
@ -531,7 +530,6 @@ inspecting signatures in Mach-O binaries.")
which
coreutils-minimal
;; File(system) inspection
file
grep
diffutils
findutils

View file

@ -88,6 +88,10 @@ $(package)_config_opts += -no-feature-qtplugininfo
$(package)_config_env := CC="$$(build_CC)"
$(package)_config_env += CXX="$$(build_CXX)"
ifeq ($(build_os),darwin)
$(package)_config_env += OBJC="$$(build_CC)"
$(package)_config_env += OBJCXX="$$(build_CXX)"
endif
$(package)_cmake_opts := -DCMAKE_EXE_LINKER_FLAGS="$$(build_LDFLAGS)"
ifneq ($(V),)
@ -148,5 +152,5 @@ endef
define $(package)_postprocess_cmds
rm -rf doc/ && \
mv -t .. translations/
mv translations/ ..
endef

View file

@ -190,6 +190,14 @@ ifneq ($(host),$(build))
$(package)_cmake_opts += -DCMAKE_SYSTEM_NAME=$($(host_os)_cmake_system_name)
$(package)_cmake_opts += -DCMAKE_SYSTEM_VERSION=$($(host_os)_cmake_system_version)
$(package)_cmake_opts += -DCMAKE_SYSTEM_PROCESSOR=$(host_arch)
# Native packages cannot be used during cross-compiling. However,
# Qt still unconditionally tries to find them, which causes issues
# in some cases, such as when cross-compiling from macOS to Windows.
# Explicitly disable this unnecessary Qt behaviour.
$(package)_cmake_opts += -DCMAKE_DISABLE_FIND_PACKAGE_Libb2=TRUE
$(package)_cmake_opts += -DCMAKE_DISABLE_FIND_PACKAGE_WrapSystemDoubleConversion=TRUE
$(package)_cmake_opts += -DCMAKE_DISABLE_FIND_PACKAGE_WrapSystemMd4c=TRUE
$(package)_cmake_opts += -DCMAKE_DISABLE_FIND_PACKAGE_WrapZSTD=TRUE
endif
ifeq ($(host_os),darwin)
$(package)_cmake_opts += -DCMAKE_INSTALL_NAME_TOOL=true

View file

@ -125,6 +125,19 @@ For more information on ZMQ, see: [zmq.md](zmq.md)
---
### IPC Dependencies
Compiling IPC-enabled binaries with `-DENABLE_IPC=ON` requires the following dependency.
Skip if you do not need IPC functionality.
```bash
brew install capnp
```
For more information on IPC, see: [multiprocess.md](multiprocess.md).
---
#### Test Suite Dependencies
There is an included test suite that is useful for testing code changes when developing.

View file

@ -68,6 +68,11 @@ User-Space, Statically Defined Tracing (USDT) dependencies:
sudo apt install systemtap-sdt-dev
IPC-enabled binaries are compiled with `-DENABLE_IPC=ON` and require the following dependencies.
Skip if you do not need IPC functionality.
sudo apt-get install libcapnp-dev capnproto
GUI dependencies:
Bitcoin Core includes a GUI built with the cross-platform Qt Framework. To compile the GUI, we need to install
@ -118,6 +123,11 @@ User-Space, Statically Defined Tracing (USDT) dependencies:
sudo dnf install systemtap-sdt-devel
IPC-enabled binaries are compiled with `-DENABLE_IPC=ON` and require the following dependency.
Skip if you do not need IPC functionality.
sudo dnf install capnproto
GUI dependencies:
Bitcoin Core includes a GUI built with the cross-platform Qt Framework. To compile the GUI, we need to install

View file

@ -19,7 +19,7 @@ One can use `--preset=libfuzzer-nosan` to do the same without common sanitizers
See [further](#run-without-sanitizers-for-increased-throughput) for more information.
There is also a runner script to execute all fuzz targets. Refer to
`./test/fuzz/test_runner.py --help` for more details.
`./build_fuzz/test/fuzz/test_runner.py --help` for more details.
## Overview of Bitcoin Core fuzzing
@ -150,6 +150,37 @@ If you find coverage increasing inputs when fuzzing you are highly encouraged to
Every single pull request submitted against the Bitcoin Core repo is automatically tested against all inputs in the [`bitcoin-core/qa-assets`](https://github.com/bitcoin-core/qa-assets) repo. Contributing new coverage increasing inputs is an easy way to help make Bitcoin Core more robust.
## Building and debugging fuzz tests
There are 3 ways fuzz tests can be built:
1. With `-DBUILD_FOR_FUZZING=ON` which forces on fuzz determinism (skipping
proof of work checks, disabling random number seeding, disabling clock time)
and causes `Assume()` checks to abort on failure.
This is the normal way to run fuzz tests and generate new inputs. Because
determinism is hardcoded on in this build, only the fuzz binary can be built
and all other binaries are disabled.
2. With `-DBUILD_FUZZ_BINARY=ON -DCMAKE_BUILD_TYPE=Debug` which causes
`Assume()` checks to abort on failure, and enables fuzz determinism, but
makes it optional.
Determinism is turned on in the fuzz binary by default, but can be turned off
by setting the `FUZZ_NONDETERMINISM` environment variable to any value, which
may be useful for running fuzz tests with code that deterministic execution
would otherwise skip.
Since `BUILD_FUZZ_BINARY`, unlike `BUILD_FOR_FUZZING`, does not hardcode on
determinism, this allows non-fuzz binaries to coexist in the same build,
making it possible to reproduce fuzz test failures in a normal build.
3. With `-DBUILD_FUZZ_BINARY=ON -DCMAKE_BUILD_TYPE=Release`. In this build, the
fuzz binary will build but refuse to run, because in release builds
determinism is forced off and `Assume()` checks do not abort, so running the
tests would not be useful. This build is only useful for ensuring fuzz tests
compile and link.
## macOS hints for libFuzzer
The default Clang/LLVM version supplied by Apple on macOS does not include

View file

@ -0,0 +1,9 @@
# RPC
- The RPCs psbtbumpfee and bumpfee allow a replacement under fullrbf and no
longer require BIP-125 signalling. (#31953)
# GUI
- A transaction's fee bump is allowed under fullrbf and no longer requires
BIP-125 signalling. (#31953)

View file

@ -79,8 +79,8 @@ if(ENABLE_WALLET)
target_link_libraries(bench_bitcoin bitcoin_wallet)
endif()
add_test(NAME bench_sanity_check_high_priority
COMMAND bench_bitcoin -sanity-check -priority-level=high
add_test(NAME bench_sanity_check
COMMAND bench_bitcoin -sanity-check
)
install_binary_component(bench_bitcoin)

View file

@ -8,6 +8,7 @@
#include <key.h>
#include <prevector.h>
#include <random.h>
#include <script/script.h>
#include <cstddef>
#include <cstdint>
@ -16,7 +17,6 @@
static const size_t BATCHES = 101;
static const size_t BATCH_SIZE = 30;
static const int PREVECTOR_SIZE = 28;
static const unsigned int QUEUE_BATCH_SIZE = 128;
// This Benchmark tests the CheckQueue with a slightly realistic workload,

View file

@ -5,17 +5,20 @@
#include <prevector.h>
#include <bench/bench.h>
#include <script/script.h>
#include <serialize.h>
#include <streams.h>
#include <type_traits>
#include <vector>
struct nontrivial_t {
struct nontrivial_t
{
int x{-1};
nontrivial_t() = default;
SERIALIZE_METHODS(nontrivial_t, obj) { READWRITE(obj.x); }
};
static_assert(!std::is_trivially_default_constructible_v<nontrivial_t>,
"expected nontrivial_t to not be trivially constructible");
@ -27,22 +30,22 @@ template <typename T>
static void PrevectorDestructor(benchmark::Bench& bench)
{
bench.batch(2).run([&] {
prevector<28, T> t0;
prevector<28, T> t1;
t0.resize(28);
t1.resize(29);
prevector<PREVECTOR_SIZE, T> t0;
prevector<PREVECTOR_SIZE, T> t1;
t0.resize(PREVECTOR_SIZE);
t1.resize(PREVECTOR_SIZE + 1);
});
}
template <typename T>
static void PrevectorClear(benchmark::Bench& bench)
{
prevector<28, T> t0;
prevector<28, T> t1;
prevector<PREVECTOR_SIZE, T> t0;
prevector<PREVECTOR_SIZE, T> t1;
bench.batch(2).run([&] {
t0.resize(28);
t0.resize(PREVECTOR_SIZE);
t0.clear();
t1.resize(29);
t1.resize(PREVECTOR_SIZE + 1);
t1.clear();
});
}
@ -50,12 +53,12 @@ static void PrevectorClear(benchmark::Bench& bench)
template <typename T>
static void PrevectorResize(benchmark::Bench& bench)
{
prevector<28, T> t0;
prevector<28, T> t1;
prevector<PREVECTOR_SIZE, T> t0;
prevector<PREVECTOR_SIZE, T> t1;
bench.batch(4).run([&] {
t0.resize(28);
t0.resize(PREVECTOR_SIZE);
t0.resize(0);
t1.resize(29);
t1.resize(PREVECTOR_SIZE + 1);
t1.resize(0);
});
}
@ -64,8 +67,8 @@ template <typename T>
static void PrevectorDeserialize(benchmark::Bench& bench)
{
DataStream s0{};
prevector<28, T> t0;
t0.resize(28);
prevector<PREVECTOR_SIZE, T> t0;
t0.resize(PREVECTOR_SIZE);
for (auto x = 0; x < 900; ++x) {
s0 << t0;
}
@ -74,7 +77,7 @@ static void PrevectorDeserialize(benchmark::Bench& bench)
s0 << t0;
}
bench.batch(1000).run([&] {
prevector<28, T> t1;
prevector<PREVECTOR_SIZE, T> t1;
for (auto x = 0; x < 1000; ++x) {
s0 >> t1;
}
@ -86,7 +89,7 @@ template <typename T>
static void PrevectorFillVectorDirect(benchmark::Bench& bench)
{
bench.run([&] {
std::vector<prevector<28, T>> vec;
std::vector<prevector<PREVECTOR_SIZE, T>> vec;
vec.reserve(260);
for (size_t i = 0; i < 260; ++i) {
vec.emplace_back();
@ -99,11 +102,11 @@ template <typename T>
static void PrevectorFillVectorIndirect(benchmark::Bench& bench)
{
bench.run([&] {
std::vector<prevector<28, T>> vec;
std::vector<prevector<PREVECTOR_SIZE, T>> vec;
vec.reserve(260);
for (size_t i = 0; i < 260; ++i) {
// force allocation
vec.emplace_back(29, T{});
vec.emplace_back(PREVECTOR_SIZE + 1, T{});
}
});
}

View file

@ -4,7 +4,6 @@
#include <addresstype.h>
#include <bench/bench.h>
#include <bitcoin-build-config.h> // IWYU pragma: keep
#include <key.h>
#include <key_io.h>
#include <script/descriptor.h>
@ -26,7 +25,7 @@
#include <utility>
namespace wallet {
static void WalletIsMine(benchmark::Bench& bench, bool legacy_wallet, int num_combo = 0)
static void WalletIsMine(benchmark::Bench& bench, int num_combo = 0)
{
const auto test_setup = MakeNoLogFileContext<TestingSetup>();
@ -36,16 +35,13 @@ static void WalletIsMine(benchmark::Bench& bench, bool legacy_wallet, int num_co
// Setup the wallet
// Loading the wallet will also create it
uint64_t create_flags = 0;
if (!legacy_wallet) {
create_flags = WALLET_FLAG_DESCRIPTORS;
}
uint64_t create_flags = WALLET_FLAG_DESCRIPTORS;
auto database = CreateMockableWalletDatabase();
auto wallet = TestLoadWallet(std::move(database), context, create_flags);
// For a descriptor wallet, fill with num_combo combo descriptors with random keys
// This benchmarks a non-HD wallet migrated to descriptors
if (!legacy_wallet && num_combo > 0) {
if (num_combo > 0) {
LOCK(wallet->cs_wallet);
for (int i = 0; i < num_combo; ++i) {
CKey key;
@ -54,8 +50,8 @@ static void WalletIsMine(benchmark::Bench& bench, bool legacy_wallet, int num_co
std::string error;
std::vector<std::unique_ptr<Descriptor>> desc = Parse("combo(" + EncodeSecret(key) + ")", keys, error, /*require_checksum=*/false);
WalletDescriptor w_desc(std::move(desc.at(0)), /*creation_time=*/0, /*range_start=*/0, /*range_end=*/0, /*next_index=*/0);
auto spkm = wallet->AddWalletDescriptor(w_desc, keys, /*label=*/"", /*internal=*/false);
assert(spkm);
auto spk_manager = *Assert(wallet->AddWalletDescriptor(w_desc, keys, /*label=*/"", /*internal=*/false));
assert(spk_manager);
}
}
@ -70,13 +66,8 @@ static void WalletIsMine(benchmark::Bench& bench, bool legacy_wallet, int num_co
TestUnloadWallet(std::move(wallet));
}
#ifdef USE_BDB
static void WalletIsMineLegacy(benchmark::Bench& bench) { WalletIsMine(bench, /*legacy_wallet=*/true); }
BENCHMARK(WalletIsMineLegacy, benchmark::PriorityLevel::LOW);
#endif
static void WalletIsMineDescriptors(benchmark::Bench& bench) { WalletIsMine(bench, /*legacy_wallet=*/false); }
static void WalletIsMineMigratedDescriptors(benchmark::Bench& bench) { WalletIsMine(bench, /*legacy_wallet=*/false, /*num_combo=*/2000); }
static void WalletIsMineDescriptors(benchmark::Bench& bench) { WalletIsMine(bench); }
static void WalletIsMineMigratedDescriptors(benchmark::Bench& bench) { WalletIsMine(bench, /*num_combo=*/2000); }
BENCHMARK(WalletIsMineDescriptors, benchmark::PriorityLevel::LOW);
BENCHMARK(WalletIsMineMigratedDescriptors, benchmark::PriorityLevel::LOW);
} // namespace wallet

View file

@ -4,7 +4,6 @@
#include <addresstype.h>
#include <bench/bench.h>
#include <bitcoin-build-config.h> // IWYU pragma: keep
#include <consensus/amount.h>
#include <outputtype.h>
#include <primitives/transaction.h>
@ -32,7 +31,7 @@ static void AddTx(CWallet& wallet)
wallet.AddToWallet(MakeTransactionRef(mtx), TxStateInactive{});
}
static void WalletLoading(benchmark::Bench& bench, bool legacy_wallet)
static void WalletLoadingDescriptors(benchmark::Bench& bench)
{
const auto test_setup = MakeNoLogFileContext<TestingSetup>();
@ -42,10 +41,7 @@ static void WalletLoading(benchmark::Bench& bench, bool legacy_wallet)
// Setup the wallet
// Loading the wallet will also create it
uint64_t create_flags = 0;
if (!legacy_wallet) {
create_flags = WALLET_FLAG_DESCRIPTORS;
}
uint64_t create_flags = WALLET_FLAG_DESCRIPTORS;
auto database = CreateMockableWalletDatabase();
auto wallet = TestLoadWallet(std::move(database), context, create_flags);
@ -68,11 +64,5 @@ static void WalletLoading(benchmark::Bench& bench, bool legacy_wallet)
});
}
#ifdef USE_BDB
static void WalletLoadingLegacy(benchmark::Bench& bench) { WalletLoading(bench, /*legacy_wallet=*/true); }
BENCHMARK(WalletLoadingLegacy, benchmark::PriorityLevel::HIGH);
#endif
static void WalletLoadingDescriptors(benchmark::Bench& bench) { WalletLoading(bench, /*legacy_wallet=*/false); }
BENCHMARK(WalletLoadingDescriptors, benchmark::PriorityLevel::HIGH);
} // namespace wallet

View file

@ -3,14 +3,15 @@
// file COPYING or https://www.opensource.org/licenses/mit-license.php.
#include <bench/bench.h>
#include <kernel/chain.h>
#include <interfaces/chain.h>
#include <interfaces/wallet.h>
#include <kernel/chain.h>
#include <node/context.h>
#include <test/util/mining.h>
#include <test/util/setup_common.h>
#include <wallet/test/util.h>
#include <wallet/context.h>
#include <wallet/receive.h>
#include <wallet/test/util.h>
#include <wallet/wallet.h>
#include <optional>
@ -19,11 +20,8 @@ namespace wallet{
static void WalletMigration(benchmark::Bench& bench)
{
const auto test_setup = MakeNoLogFileContext<TestingSetup>();
WalletContext context;
context.args = &test_setup->m_args;
context.chain = test_setup->m_node.chain.get();
const auto test_setup{MakeNoLogFileContext<TestingSetup>()};
const auto loader{MakeWalletLoader(*test_setup->m_node.chain, test_setup->m_args)};
// Number of imported watch only addresses
int NUM_WATCH_ONLY_ADDR = 20;
@ -63,12 +61,13 @@ static void WalletMigration(benchmark::Bench& bench)
batch.WriteKey(pubkey, key.GetPrivKey(), CKeyMetadata());
}
bench.epochs(/*numEpochs=*/1).run([&context, &wallet] {
util::Result<MigrationResult> res = MigrateLegacyToDescriptor(std::move(wallet), /*passphrase=*/"", context, /*was_loaded=*/false);
assert(res);
assert(res->wallet);
assert(res->watchonly_wallet);
});
bench.epochs(/*numEpochs=*/1).epochIterations(/*numIters=*/1) // run the migration exactly once
.run([&] {
auto res{MigrateLegacyToDescriptor(std::move(wallet), /*passphrase=*/"", *loader->context(), /*was_loaded=*/false)};
assert(res);
assert(res->wallet);
assert(res->watchonly_wallet);
});
}
BENCHMARK(WalletMigration, benchmark::PriorityLevel::LOW);

View file

@ -1220,7 +1220,7 @@ static int CommandLineRPC(int argc, char *argv[])
if (gArgs.GetBoolArg("-stdinwalletpassphrase", false)) {
NO_STDIN_ECHO();
std::string walletPass;
if (args.size() < 1 || args[0].substr(0, 16) != "walletpassphrase") {
if (args.size() < 1 || !args[0].starts_with("walletpassphrase")) {
throw std::runtime_error("-stdinwalletpassphrase is only applicable for walletpassphrase(change)");
}
if (!StdinReady()) {

View file

@ -40,13 +40,11 @@ static void SetupWalletToolArgs(ArgsManager& argsman)
argsman.AddArg("-debug=<category>", "Output debugging information (default: 0).", ArgsManager::ALLOW_ANY, OptionsCategory::DEBUG_TEST);
argsman.AddArg("-descriptors", "Create descriptors wallet. Only for 'create'", ArgsManager::ALLOW_ANY, OptionsCategory::OPTIONS);
argsman.AddArg("-legacy", "Create legacy wallet. Only for 'create'", ArgsManager::ALLOW_ANY, OptionsCategory::OPTIONS);
argsman.AddArg("-format=<format>", "The format of the wallet file to create. Either \"bdb\" or \"sqlite\". Only used with 'createfromdump'", ArgsManager::ALLOW_ANY, OptionsCategory::OPTIONS);
argsman.AddArg("-printtoconsole", "Send trace/debug info to console (default: 1 when no -debug is true, 0 otherwise).", ArgsManager::ALLOW_ANY, OptionsCategory::DEBUG_TEST);
argsman.AddArg("-withinternalbdb", "Use the internal Berkeley DB parser when dumping a Berkeley DB wallet file (default: false)", ArgsManager::ALLOW_ANY, OptionsCategory::DEBUG_TEST);
argsman.AddCommand("info", "Get wallet info");
argsman.AddCommand("create", "Create new wallet file");
argsman.AddCommand("salvage", "Attempt to recover private keys from a corrupt wallet. Warning: 'salvage' is experimental.");
argsman.AddCommand("dump", "Print out all of the wallet key-value records");
argsman.AddCommand("createfromdump", "Create new wallet file from dumped records");
}

View file

@ -85,7 +85,7 @@ KeyInfo InterpretKey(std::string key)
result.section = key.substr(0, option_index);
key.erase(0, option_index + 1);
}
if (key.substr(0, 2) == "no") {
if (key.starts_with("no")) {
key.erase(0, 2);
result.negated = true;
}
@ -189,7 +189,7 @@ bool ArgsManager::ParseParameters(int argc, const char* const argv[], std::strin
// internet" warning, and clicks the Open button, macOS passes
// a unique process serial number (PSN) as -psn_... command-line
// argument, which we filter out.
if (key.substr(0, 5) == "-psn_") continue;
if (key.starts_with("-psn_")) continue;
#endif
if (key == "-") break; //bitcoin-tx using stdin

View file

@ -65,7 +65,7 @@ static bool GetConfigOptions(std::istream& stream, const std::string& filepath,
}
} else {
error = strprintf("parse error on line %i: %s", linenr, str);
if (str.size() >= 2 && str.substr(0, 2) == "no") {
if (str.size() >= 2 && str.starts_with("no")) {
error += strprintf(", if you intended to specify a negated option, use %s=1 instead", str);
}
return false;

View file

@ -83,7 +83,7 @@ CScript ParseScript(const std::string& s)
}
result << num.value();
} else if (w.substr(0, 2) == "0x" && w.size() > 2 && IsHex(std::string(w.begin() + 2, w.end()))) {
} else if (w.starts_with("0x") && w.size() > 2 && IsHex(std::string(w.begin() + 2, w.end()))) {
// Raw hex data, inserted NOT pushed onto stack:
std::vector<unsigned char> raw = ParseHex(std::string(w.begin() + 2, w.end()));
result.insert(result.end(), raw.begin(), raw.end());

View file

@ -134,7 +134,7 @@ static bool multiUserAuthorized(std::string strUserPass)
static bool RPCAuthorized(const std::string& strAuth, std::string& strAuthUsernameOut)
{
if (strAuth.substr(0, 6) != "Basic ")
if (!strAuth.starts_with("Basic "))
return false;
std::string_view strUserPass64 = TrimStringView(std::string_view{strAuth}.substr(6));
auto userpass_data = DecodeBase64(strUserPass64);

View file

@ -299,7 +299,7 @@ Session::Reply Session::SendRequestAndGetReply(const Sock& sock,
Reply reply;
// Don't log the full "SESSION CREATE ..." because it contains our private key.
reply.request = request.substr(0, 14) == "SESSION CREATE" ? "SESSION CREATE ..." : request;
reply.request = request.starts_with("SESSION CREATE") ? "SESSION CREATE ..." : request;
// It could take a few minutes for the I2P router to reply as it is querying the I2P network
// (when doing name lookup, for example). Notice: `RecvUntilTerminator()` is checking

View file

@ -624,11 +624,11 @@ void BlockManager::CleanupBlockRevFiles() const
const std::string path = fs::PathToString(it->path().filename());
if (fs::is_regular_file(*it) &&
path.length() == 12 &&
path.substr(8,4) == ".dat")
path.ends_with(".dat"))
{
if (path.substr(0, 3) == "blk") {
if (path.starts_with("blk")) {
mapBlockFiles[path.substr(3, 5)] = it->path();
} else if (path.substr(0, 3) == "rev") {
} else if (path.starts_with("rev")) {
remove(it->path());
}
}

View file

@ -139,7 +139,7 @@ bool PermittedDifficultyTransition(const Consensus::Params& params, int64_t heig
// the most significant bit of the last byte of the hash is set.
bool CheckProofOfWork(uint256 hash, unsigned int nBits, const Consensus::Params& params)
{
if constexpr (G_FUZZING) return (hash.data()[31] & 0x80) == 0;
if (EnableFuzzDeterminism()) return (hash.data()[31] & 0x80) == 0;
return CheckProofOfWorkImpl(hash, nBits, params);
}

View file

@ -468,7 +468,7 @@ bool SendCoinsDialog::signWithExternalSigner(PartiallySignedTransaction& psbtx,
return false;
}
if (err) {
tfm::format(std::cerr, "Failed to sign PSBT");
qWarning() << "Failed to sign PSBT";
processSendCoinsReturn(WalletModel::TransactionCreationFailed);
return false;
}
@ -855,6 +855,10 @@ void SendCoinsDialog::updateCoinControlState()
}
void SendCoinsDialog::updateNumberOfBlocks(int count, const QDateTime& blockDate, double nVerificationProgress, SyncType synctype, SynchronizationState sync_state) {
// During shutdown, clientModel will be nullptr. Attempting to update views at this point may cause a crash
// due to accessing backend models that might no longer exist.
if (!clientModel) return;
// Process event
if (sync_state == SynchronizationState::POST_INIT) {
updateSmartFeeLabel();
}

View file

@ -64,13 +64,13 @@ void RPCNestedTests::rpcNestedTests()
RPCConsole::RPCExecuteCommandLine(m_node, result, "getblock( getblock( getblock(getbestblockhash())[hash] )[hash], true)"); //4 level nesting with whitespace, filtering path and boolean parameter
RPCConsole::RPCExecuteCommandLine(m_node, result, "getblockchaininfo");
QVERIFY(result.substr(0,1) == "{");
QVERIFY(result.starts_with("{"));
RPCConsole::RPCExecuteCommandLine(m_node, result, "getblockchaininfo()");
QVERIFY(result.substr(0,1) == "{");
QVERIFY(result.starts_with("{"));
RPCConsole::RPCExecuteCommandLine(m_node, result, "getblockchaininfo "); //whitespace at the end will be tolerated
QVERIFY(result.substr(0,1) == "{");
QVERIFY(result.starts_with("{"));
RPCConsole::RPCExecuteCommandLine(m_node, result, "getblockchaininfo()[\"chain\"]"); //Quote path identifier are allowed, but look after a child containing the quotes in the key
QVERIFY(result == "null");

View file

@ -189,41 +189,34 @@ void SyncUpWallet(const std::shared_ptr<CWallet>& wallet, interfaces::Node& node
QVERIFY(result.last_failed_block.IsNull());
}
std::shared_ptr<CWallet> SetupLegacyWatchOnlyWallet(interfaces::Node& node, TestChain100Setup& test)
{
std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(node.context()->chain.get(), "", CreateMockableWalletDatabase());
wallet->LoadWallet();
{
LOCK(wallet->cs_wallet);
wallet->SetWalletFlag(WALLET_FLAG_DISABLE_PRIVATE_KEYS);
wallet->SetupLegacyScriptPubKeyMan();
// Add watched key
CPubKey pubKey = test.coinbaseKey.GetPubKey();
bool import_keys = wallet->ImportPubKeys({{pubKey.GetID(), false}}, {{pubKey.GetID(), pubKey}} , /*key_origins=*/{}, /*add_keypool=*/false, /*timestamp=*/1);
assert(import_keys);
wallet->SetLastBlockProcessed(105, WITH_LOCK(node.context()->chainman->GetMutex(), return node.context()->chainman->ActiveChain().Tip()->GetBlockHash()));
}
SyncUpWallet(wallet, node);
return wallet;
}
std::shared_ptr<CWallet> SetupDescriptorsWallet(interfaces::Node& node, TestChain100Setup& test)
std::shared_ptr<CWallet> SetupDescriptorsWallet(interfaces::Node& node, TestChain100Setup& test, bool watch_only = false)
{
std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(node.context()->chain.get(), "", CreateMockableWalletDatabase());
wallet->LoadWallet();
LOCK(wallet->cs_wallet);
wallet->SetWalletFlag(WALLET_FLAG_DESCRIPTORS);
wallet->SetupDescriptorScriptPubKeyMans();
if (watch_only) {
wallet->SetWalletFlag(WALLET_FLAG_DISABLE_PRIVATE_KEYS);
} else {
wallet->SetupDescriptorScriptPubKeyMans();
}
// Add the coinbase key
FlatSigningProvider provider;
std::string error;
auto descs = Parse("combo(" + EncodeSecret(test.coinbaseKey) + ")", provider, error, /* require_checksum=*/ false);
std::string key_str;
if (watch_only) {
key_str = HexStr(test.coinbaseKey.GetPubKey());
} else {
key_str = EncodeSecret(test.coinbaseKey);
}
auto descs = Parse("combo(" + key_str + ")", provider, error, /* require_checksum=*/ false);
assert(!descs.empty());
assert(descs.size() == 1);
auto& desc = descs.at(0);
WalletDescriptor w_desc(std::move(desc), 0, 0, 1, 1);
if (!wallet->AddWalletDescriptor(w_desc, provider, "", false)) assert(false);
auto spk_manager = *Assert(wallet->AddWalletDescriptor(w_desc, provider, "", false));
assert(spk_manager);
CTxDestination dest = GetDestinationForKey(test.coinbaseKey.GetPubKey(), wallet->m_default_address_type);
wallet->SetAddressBook(dest, "", wallet::AddressPurpose::RECEIVE);
wallet->SetLastBlockProcessed(105, WITH_LOCK(node.context()->chainman->GetMutex(), return node.context()->chainman->ActiveChain().Tip()->GetBlockHash()));
@ -300,8 +293,8 @@ void TestGUI(interfaces::Node& node, const std::shared_ptr<CWallet>& wallet)
QVERIFY(FindTx(*transactionTableModel, txid1).isValid());
QVERIFY(FindTx(*transactionTableModel, txid2).isValid());
// Call bumpfee. Test disabled, canceled, enabled, then failing cases.
BumpFee(transactionView, txid1, /*expectDisabled=*/true, /*expectError=*/"not BIP 125 replaceable", /*cancel=*/false);
// Call bumpfee. Test canceled fullrbf bump, canceled bip-125-rbf bump, passing bump, and then failing bump.
BumpFee(transactionView, txid1, /*expectDisabled=*/false, /*expectError=*/{}, /*cancel=*/true);
BumpFee(transactionView, txid2, /*expectDisabled=*/false, /*expectError=*/{}, /*cancel=*/true);
BumpFee(transactionView, txid2, /*expectDisabled=*/false, /*expectError=*/{}, /*cancel=*/false);
BumpFee(transactionView, txid2, /*expectDisabled=*/true, /*expectError=*/"already bumped", /*cancel=*/false);
@ -397,7 +390,7 @@ void TestGUI(interfaces::Node& node, const std::shared_ptr<CWallet>& wallet)
void TestGUIWatchOnly(interfaces::Node& node, TestChain100Setup& test)
{
const std::shared_ptr<CWallet>& wallet = SetupLegacyWatchOnlyWallet(node, test);
const std::shared_ptr<CWallet>& wallet = SetupDescriptorsWallet(node, test, /*watch_only=*/true);
// Create widgets and init models
std::unique_ptr<const PlatformStyle> platformStyle(PlatformStyle::instantiate("other"));
@ -409,7 +402,7 @@ void TestGUIWatchOnly(interfaces::Node& node, TestChain100Setup& test)
// Update walletModel cached balance which will trigger an update for the 'labelBalance' QLabel.
walletModel.pollBalanceChanged();
// Check balance in send dialog
CompareBalance(walletModel, walletModel.wallet().getBalances().watch_only_balance,
CompareBalance(walletModel, walletModel.wallet().getBalances().balance,
sendCoinsDialog.findChild<QLabel*>("labelBalance"));
// Set change address

View file

@ -174,22 +174,20 @@ public:
* Used by the Miniscript descriptors to check for duplicate keys in the script.
*/
bool operator<(PubkeyProvider& other) const {
CPubKey a, b;
SigningProvider dummy;
KeyOriginInfo dummy_info;
FlatSigningProvider dummy;
GetPubKey(0, dummy, a, dummy_info);
other.GetPubKey(0, dummy, b, dummy_info);
std::optional<CPubKey> a = GetPubKey(0, dummy, dummy);
std::optional<CPubKey> b = other.GetPubKey(0, dummy, dummy);
return a < b;
}
/** Derive a public key.
/** Derive a public key and put it into out.
* read_cache is the cache to read keys from (if not nullptr)
* write_cache is the cache to write keys to (if not nullptr)
* Caches are not exclusive but this is not tested. Currently we use them exclusively
*/
virtual bool GetPubKey(int pos, const SigningProvider& arg, CPubKey& key, KeyOriginInfo& info, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const = 0;
virtual std::optional<CPubKey> GetPubKey(int pos, const SigningProvider& arg, FlatSigningProvider& out, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const = 0;
/** Whether this represent multiple public keys at different positions. */
virtual bool IsRange() const = 0;
@ -213,8 +211,8 @@ public:
*/
virtual bool ToNormalizedString(const SigningProvider& arg, std::string& out, const DescriptorCache* cache = nullptr) const = 0;
/** Derive a private key, if private data is available in arg. */
virtual bool GetPrivKey(int pos, const SigningProvider& arg, CKey& key) const = 0;
/** Derive a private key, if private data is available in arg and put it into out. */
virtual void GetPrivKey(int pos, const SigningProvider& arg, FlatSigningProvider& out) const = 0;
/** Return the non-extended public key for this PubkeyProvider, if it has one. */
virtual std::optional<CPubKey> GetRootPubKey() const = 0;
@ -240,12 +238,16 @@ class OriginPubkeyProvider final : public PubkeyProvider
public:
OriginPubkeyProvider(uint32_t exp_index, KeyOriginInfo info, std::unique_ptr<PubkeyProvider> provider, bool apostrophe) : PubkeyProvider(exp_index), m_origin(std::move(info)), m_provider(std::move(provider)), m_apostrophe(apostrophe) {}
bool GetPubKey(int pos, const SigningProvider& arg, CPubKey& key, KeyOriginInfo& info, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const override
std::optional<CPubKey> GetPubKey(int pos, const SigningProvider& arg, FlatSigningProvider& out, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const override
{
if (!m_provider->GetPubKey(pos, arg, key, info, read_cache, write_cache)) return false;
std::copy(std::begin(m_origin.fingerprint), std::end(m_origin.fingerprint), info.fingerprint);
info.path.insert(info.path.begin(), m_origin.path.begin(), m_origin.path.end());
return true;
std::optional<CPubKey> pub = m_provider->GetPubKey(pos, arg, out, read_cache, write_cache);
if (!pub) return std::nullopt;
Assert(out.pubkeys.contains(pub->GetID()));
auto& [pubkey, suborigin] = out.origins[pub->GetID()];
Assert(pubkey == *pub); // m_provider must have a valid origin by this point.
std::copy(std::begin(m_origin.fingerprint), std::end(m_origin.fingerprint), suborigin.fingerprint);
suborigin.path.insert(suborigin.path.begin(), m_origin.path.begin(), m_origin.path.end());
return pub;
}
bool IsRange() const override { return m_provider->IsRange(); }
size_t GetSize() const override { return m_provider->GetSize(); }
@ -272,9 +274,9 @@ public:
}
return true;
}
bool GetPrivKey(int pos, const SigningProvider& arg, CKey& key) const override
void GetPrivKey(int pos, const SigningProvider& arg, FlatSigningProvider& out) const override
{
return m_provider->GetPrivKey(pos, arg, key);
m_provider->GetPrivKey(pos, arg, out);
}
std::optional<CPubKey> GetRootPubKey() const override
{
@ -296,24 +298,33 @@ class ConstPubkeyProvider final : public PubkeyProvider
CPubKey m_pubkey;
bool m_xonly;
std::optional<CKey> GetPrivKey(const SigningProvider& arg) const
{
CKey key;
if (!(m_xonly ? arg.GetKeyByXOnly(XOnlyPubKey(m_pubkey), key) :
arg.GetKey(m_pubkey.GetID(), key))) return std::nullopt;
return key;
}
public:
ConstPubkeyProvider(uint32_t exp_index, const CPubKey& pubkey, bool xonly) : PubkeyProvider(exp_index), m_pubkey(pubkey), m_xonly(xonly) {}
bool GetPubKey(int pos, const SigningProvider& arg, CPubKey& key, KeyOriginInfo& info, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const override
std::optional<CPubKey> GetPubKey(int pos, const SigningProvider&, FlatSigningProvider& out, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const override
{
key = m_pubkey;
info.path.clear();
KeyOriginInfo info;
CKeyID keyid = m_pubkey.GetID();
std::copy(keyid.begin(), keyid.begin() + sizeof(info.fingerprint), info.fingerprint);
return true;
out.origins.emplace(keyid, std::make_pair(m_pubkey, info));
out.pubkeys.emplace(keyid, m_pubkey);
return m_pubkey;
}
bool IsRange() const override { return false; }
size_t GetSize() const override { return m_pubkey.size(); }
std::string ToString(StringType type) const override { return m_xonly ? HexStr(m_pubkey).substr(2) : HexStr(m_pubkey); }
bool ToPrivateString(const SigningProvider& arg, std::string& ret) const override
{
CKey key;
if (!GetPrivKey(/*pos=*/0, arg, key)) return false;
ret = EncodeSecret(key);
std::optional<CKey> key = GetPrivKey(arg);
if (!key) return false;
ret = EncodeSecret(*key);
return true;
}
bool ToNormalizedString(const SigningProvider& arg, std::string& ret, const DescriptorCache* cache) const override
@ -321,10 +332,11 @@ public:
ret = ToString(StringType::PUBLIC);
return true;
}
bool GetPrivKey(int pos, const SigningProvider& arg, CKey& key) const override
void GetPrivKey(int pos, const SigningProvider& arg, FlatSigningProvider& out) const override
{
return m_xonly ? arg.GetKeyByXOnly(XOnlyPubKey(m_pubkey), key) :
arg.GetKey(m_pubkey.GetID(), key);
std::optional<CKey> key = GetPrivKey(arg);
if (!key) return;
out.keys.emplace(key->GetPubKey().GetID(), *key);
}
std::optional<CPubKey> GetRootPubKey() const override
{
@ -394,18 +406,14 @@ public:
BIP32PubkeyProvider(uint32_t exp_index, const CExtPubKey& extkey, KeyPath path, DeriveType derive, bool apostrophe) : PubkeyProvider(exp_index), m_root_extkey(extkey), m_path(std::move(path)), m_derive(derive), m_apostrophe(apostrophe) {}
bool IsRange() const override { return m_derive != DeriveType::NO; }
size_t GetSize() const override { return 33; }
bool GetPubKey(int pos, const SigningProvider& arg, CPubKey& key_out, KeyOriginInfo& final_info_out, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const override
std::optional<CPubKey> GetPubKey(int pos, const SigningProvider& arg, FlatSigningProvider& out, const DescriptorCache* read_cache = nullptr, DescriptorCache* write_cache = nullptr) const override
{
// Info of parent of the to be derived pubkey
KeyOriginInfo parent_info;
KeyOriginInfo info;
CKeyID keyid = m_root_extkey.pubkey.GetID();
std::copy(keyid.begin(), keyid.begin() + sizeof(parent_info.fingerprint), parent_info.fingerprint);
parent_info.path = m_path;
// Info of the derived key itself which is copied out upon successful completion
KeyOriginInfo final_info_out_tmp = parent_info;
if (m_derive == DeriveType::UNHARDENED) final_info_out_tmp.path.push_back((uint32_t)pos);
if (m_derive == DeriveType::HARDENED) final_info_out_tmp.path.push_back(((uint32_t)pos) | 0x80000000L);
std::copy(keyid.begin(), keyid.begin() + sizeof(info.fingerprint), info.fingerprint);
info.path = m_path;
if (m_derive == DeriveType::UNHARDENED) info.path.push_back((uint32_t)pos);
if (m_derive == DeriveType::HARDENED) info.path.push_back(((uint32_t)pos) | 0x80000000L);
// Derive keys or fetch them from cache
CExtPubKey final_extkey = m_root_extkey;
@ -414,16 +422,16 @@ public:
bool der = true;
if (read_cache) {
if (!read_cache->GetCachedDerivedExtPubKey(m_expr_index, pos, final_extkey)) {
if (m_derive == DeriveType::HARDENED) return false;
if (m_derive == DeriveType::HARDENED) return std::nullopt;
// Try to get the derivation parent
if (!read_cache->GetCachedParentExtPubKey(m_expr_index, parent_extkey)) return false;
if (!read_cache->GetCachedParentExtPubKey(m_expr_index, parent_extkey)) return std::nullopt;
final_extkey = parent_extkey;
if (m_derive == DeriveType::UNHARDENED) der = parent_extkey.Derive(final_extkey, pos);
}
} else if (IsHardened()) {
CExtKey xprv;
CExtKey lh_xprv;
if (!GetDerivedExtKey(arg, xprv, lh_xprv)) return false;
if (!GetDerivedExtKey(arg, xprv, lh_xprv)) return std::nullopt;
parent_extkey = xprv.Neuter();
if (m_derive == DeriveType::UNHARDENED) der = xprv.Derive(xprv, pos);
if (m_derive == DeriveType::HARDENED) der = xprv.Derive(xprv, pos | 0x80000000UL);
@ -433,16 +441,16 @@ public:
}
} else {
for (auto entry : m_path) {
if (!parent_extkey.Derive(parent_extkey, entry)) return false;
if (!parent_extkey.Derive(parent_extkey, entry)) return std::nullopt;
}
final_extkey = parent_extkey;
if (m_derive == DeriveType::UNHARDENED) der = parent_extkey.Derive(final_extkey, pos);
assert(m_derive != DeriveType::HARDENED);
}
if (!der) return false;
if (!der) return std::nullopt;
final_info_out = final_info_out_tmp;
key_out = final_extkey.pubkey;
out.origins.emplace(final_extkey.pubkey.GetID(), std::make_pair(final_extkey.pubkey, info));
out.pubkeys.emplace(final_extkey.pubkey.GetID(), final_extkey.pubkey);
if (write_cache) {
// Only cache parent if there is any unhardened derivation
@ -452,12 +460,12 @@ public:
if (last_hardened_extkey.pubkey.IsValid()) {
write_cache->CacheLastHardenedExtPubKey(m_expr_index, last_hardened_extkey);
}
} else if (final_info_out.path.size() > 0) {
} else if (info.path.size() > 0) {
write_cache->CacheDerivedExtPubKey(m_expr_index, pos, final_extkey);
}
}
return true;
return final_extkey.pubkey;
}
std::string ToString(StringType type, bool normalized) const
{
@ -543,15 +551,14 @@ public:
}
return true;
}
bool GetPrivKey(int pos, const SigningProvider& arg, CKey& key) const override
void GetPrivKey(int pos, const SigningProvider& arg, FlatSigningProvider& out) const override
{
CExtKey extkey;
CExtKey dummy;
if (!GetDerivedExtKey(arg, extkey, dummy)) return false;
if (m_derive == DeriveType::UNHARDENED && !extkey.Derive(extkey, pos)) return false;
if (m_derive == DeriveType::HARDENED && !extkey.Derive(extkey, pos | 0x80000000UL)) return false;
key = extkey.key;
return true;
if (!GetDerivedExtKey(arg, extkey, dummy)) return;
if (m_derive == DeriveType::UNHARDENED && !extkey.Derive(extkey, pos)) return;
if (m_derive == DeriveType::HARDENED && !extkey.Derive(extkey, pos | 0x80000000UL)) return;
out.keys.emplace(extkey.key.GetPubKey().GetID(), extkey.key);
}
std::optional<CPubKey> GetRootPubKey() const override
{
@ -700,16 +707,17 @@ public:
// NOLINTNEXTLINE(misc-no-recursion)
bool ExpandHelper(int pos, const SigningProvider& arg, const DescriptorCache* read_cache, std::vector<CScript>& output_scripts, FlatSigningProvider& out, DescriptorCache* write_cache) const
{
std::vector<std::pair<CPubKey, KeyOriginInfo>> entries;
entries.reserve(m_pubkey_args.size());
FlatSigningProvider subprovider;
std::vector<CPubKey> pubkeys;
pubkeys.reserve(m_pubkey_args.size());
// Construct temporary data in `entries`, `subscripts`, and `subprovider` to avoid producing output in case of failure.
// Construct temporary data in `pubkeys`, `subscripts`, and `subprovider` to avoid producing output in case of failure.
for (const auto& p : m_pubkey_args) {
entries.emplace_back();
if (!p->GetPubKey(pos, arg, entries.back().first, entries.back().second, read_cache, write_cache)) return false;
std::optional<CPubKey> pubkey = p->GetPubKey(pos, arg, subprovider, read_cache, write_cache);
if (!pubkey) return false;
pubkeys.push_back(pubkey.value());
}
std::vector<CScript> subscripts;
FlatSigningProvider subprovider;
for (const auto& subarg : m_subdescriptor_args) {
std::vector<CScript> outscripts;
if (!subarg->ExpandHelper(pos, arg, read_cache, outscripts, subprovider, write_cache)) return false;
@ -718,13 +726,6 @@ public:
}
out.Merge(std::move(subprovider));
std::vector<CPubKey> pubkeys;
pubkeys.reserve(entries.size());
for (auto& entry : entries) {
pubkeys.push_back(entry.first);
out.origins.emplace(entry.first.GetID(), std::make_pair<CPubKey, KeyOriginInfo>(CPubKey(entry.first), std::move(entry.second)));
}
output_scripts = MakeScripts(pubkeys, std::span{subscripts}, out);
return true;
}
@ -743,9 +744,7 @@ public:
void ExpandPrivate(int pos, const SigningProvider& provider, FlatSigningProvider& out) const final
{
for (const auto& p : m_pubkey_args) {
CKey key;
if (!p->GetPrivKey(pos, provider, key)) continue;
out.keys.emplace(key.GetPubKey().GetID(), key);
p->GetPrivKey(pos, provider, out);
}
for (const auto& arg : m_subdescriptor_args) {
arg->ExpandPrivate(pos, provider, out);
@ -800,6 +799,7 @@ public:
return OutputTypeFromDestination(m_destination);
}
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
bool ToPrivateString(const SigningProvider& arg, std::string& out) const final { return false; }
std::optional<int64_t> ScriptSize() const override { return GetScriptForDestination(m_destination).size(); }
@ -827,6 +827,7 @@ public:
return OutputTypeFromDestination(dest);
}
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
bool ToPrivateString(const SigningProvider& arg, std::string& out) const final { return false; }
std::optional<int64_t> ScriptSize() const override { return m_script.size(); }
@ -855,6 +856,7 @@ protected:
public:
PKDescriptor(std::unique_ptr<PubkeyProvider> prov, bool xonly = false) : DescriptorImpl(Vector(std::move(prov)), "pk"), m_xonly(xonly) {}
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return true; }
std::optional<int64_t> ScriptSize() const override {
return 1 + (m_xonly ? 32 : m_pubkey_args[0]->GetSize()) + 1;
@ -881,16 +883,16 @@ public:
class PKHDescriptor final : public DescriptorImpl
{
protected:
std::vector<CScript> MakeScripts(const std::vector<CPubKey>& keys, std::span<const CScript>, FlatSigningProvider& out) const override
std::vector<CScript> MakeScripts(const std::vector<CPubKey>& keys, std::span<const CScript>, FlatSigningProvider&) const override
{
CKeyID id = keys[0].GetID();
out.pubkeys.emplace(id, keys[0]);
return Vector(GetScriptForDestination(PKHash(id)));
}
public:
PKHDescriptor(std::unique_ptr<PubkeyProvider> prov) : DescriptorImpl(Vector(std::move(prov)), "pkh") {}
std::optional<OutputType> GetOutputType() const override { return OutputType::LEGACY; }
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return true; }
std::optional<int64_t> ScriptSize() const override { return 1 + 1 + 1 + 20 + 1 + 1; }
@ -915,16 +917,16 @@ public:
class WPKHDescriptor final : public DescriptorImpl
{
protected:
std::vector<CScript> MakeScripts(const std::vector<CPubKey>& keys, std::span<const CScript>, FlatSigningProvider& out) const override
std::vector<CScript> MakeScripts(const std::vector<CPubKey>& keys, std::span<const CScript>, FlatSigningProvider&) const override
{
CKeyID id = keys[0].GetID();
out.pubkeys.emplace(id, keys[0]);
return Vector(GetScriptForDestination(WitnessV0KeyHash(id)));
}
public:
WPKHDescriptor(std::unique_ptr<PubkeyProvider> prov) : DescriptorImpl(Vector(std::move(prov)), "wpkh") {}
std::optional<OutputType> GetOutputType() const override { return OutputType::BECH32; }
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return true; }
std::optional<int64_t> ScriptSize() const override { return 1 + 1 + 20; }
@ -953,7 +955,6 @@ protected:
{
std::vector<CScript> ret;
CKeyID id = keys[0].GetID();
out.pubkeys.emplace(id, keys[0]);
ret.emplace_back(GetScriptForRawPubKey(keys[0])); // P2PK
ret.emplace_back(GetScriptForDestination(PKHash(id))); // P2PKH
if (keys[0].IsCompressed()) {
@ -967,6 +968,7 @@ protected:
public:
ComboDescriptor(std::unique_ptr<PubkeyProvider> prov) : DescriptorImpl(Vector(std::move(prov)), "combo") {}
bool IsSingleType() const final { return false; }
bool IsSingleKey() const final { return true; }
std::unique_ptr<DescriptorImpl> Clone() const override
{
return std::make_unique<ComboDescriptor>(m_pubkey_args.at(0)->Clone());
@ -991,6 +993,7 @@ protected:
public:
MultisigDescriptor(int threshold, std::vector<std::unique_ptr<PubkeyProvider>> providers, bool sorted = false) : DescriptorImpl(std::move(providers), sorted ? "sortedmulti" : "multi"), m_threshold(threshold), m_sorted(sorted) {}
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
std::optional<int64_t> ScriptSize() const override {
const auto n_keys = m_pubkey_args.size();
@ -1042,6 +1045,7 @@ protected:
public:
MultiADescriptor(int threshold, std::vector<std::unique_ptr<PubkeyProvider>> providers, bool sorted = false) : DescriptorImpl(std::move(providers), sorted ? "sortedmulti_a" : "multi_a"), m_threshold(threshold), m_sorted(sorted) {}
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
std::optional<int64_t> ScriptSize() const override {
const auto n_keys = m_pubkey_args.size();
@ -1088,6 +1092,7 @@ public:
return OutputType::LEGACY;
}
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return m_subdescriptor_args[0]->IsSingleKey(); }
std::optional<int64_t> ScriptSize() const override { return 1 + 1 + 20 + 1; }
@ -1129,6 +1134,7 @@ public:
WSHDescriptor(std::unique_ptr<DescriptorImpl> desc) : DescriptorImpl({}, std::move(desc), "wsh") {}
std::optional<OutputType> GetOutputType() const override { return OutputType::BECH32; }
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return m_subdescriptor_args[0]->IsSingleKey(); }
std::optional<int64_t> ScriptSize() const override { return 1 + 1 + 32; }
@ -1175,7 +1181,6 @@ protected:
builder.Finalize(xpk);
WitnessV1Taproot output = builder.GetOutput();
out.tr_trees[output] = builder;
out.pubkeys.emplace(keys[0].GetID(), keys[0]);
return Vector(GetScriptForDestination(output));
}
bool ToStringSubScriptHelper(const SigningProvider* arg, std::string& ret, const StringType type, const DescriptorCache* cache = nullptr) const override
@ -1207,6 +1212,7 @@ public:
}
std::optional<OutputType> GetOutputType() const override { return OutputType::BECH32M; }
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
std::optional<int64_t> ScriptSize() const override { return 1 + 1 + 32; }
@ -1334,6 +1340,7 @@ public:
bool IsSolvable() const override { return true; }
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
std::optional<int64_t> ScriptSize() const override { return m_node->ScriptSize(); }
@ -1373,6 +1380,7 @@ public:
RawTRDescriptor(std::unique_ptr<PubkeyProvider> output_key) : DescriptorImpl(Vector(std::move(output_key)), "rawtr") {}
std::optional<OutputType> GetOutputType() const override { return OutputType::BECH32M; }
bool IsSingleType() const final { return true; }
bool IsSingleKey() const final { return false; }
std::optional<int64_t> ScriptSize() const override { return 1 + 1 + 32; }

View file

@ -111,6 +111,11 @@ struct Descriptor {
/** Whether this descriptor will return one scriptPubKey or multiple (aka is or is not combo) */
virtual bool IsSingleType() const = 0;
/** Whether this descriptor only produces single key scripts (i.e. pk(), pkh(), wpkh(), sh() and wsh() wrapping one of those, and combo())
* TODO: Remove this method once legacy wallets are removed as it is only necessary for importmulti.
*/
virtual bool IsSingleKey() const = 0;
/** Convert the descriptor to a private string. This fails if the provided provider does not have the relevant private keys. */
virtual bool ToPrivateString(const SigningProvider& provider, std::string& out) const = 0;
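A minimal sketch of how the new predicate reads in practice, assuming the in-tree Parse() helper declared in script/descriptor.h (returning one descriptor per parsed string) and using the compressed secp256k1 generator point purely as a well-formed key expression; single-key forms report true and multisig forms report false, which is what the importmulti keypool gating later in this diff relies on:

#include <script/descriptor.h>
#include <script/signingprovider.h>

#include <cassert>
#include <string>

int main()
{
    FlatSigningProvider keys;
    std::string error;
    // Compressed secp256k1 generator point, used only as a syntactically valid key.
    const std::string pub{"0279be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798"};
    const auto single{Parse("wpkh(" + pub + ")", keys, error)};
    const auto multi{Parse("multi(1," + pub + ")", keys, error)};
    assert(single.size() == 1 && multi.size() == 1);
    assert(single[0]->IsSingleKey());  // eligible for a legacy keypool import
    assert(!multi[0]->IsSingleKey());  // importmulti skips the keypool for these
}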

View file

@ -406,7 +406,8 @@ private:
* Tests in October 2015 showed use of this reduced dbcache memory usage by 23%
* and made an initial sync 13% faster.
*/
typedef prevector<28, unsigned char> CScriptBase;
static constexpr unsigned int PREVECTOR_SIZE{36};
typedef prevector<PREVECTOR_SIZE, unsigned char> CScriptBase;
bool GetScriptOp(CScriptBase::const_iterator& pc, CScriptBase::const_iterator end, opcodetype& opcodeRet, std::vector<unsigned char>* pvchRet);
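A standalone sketch of what the larger inline capacity buys, assuming the in-tree prevector.h header is on the include path: a 34-byte payload (the P2WSH/P2TR script size) now stays in the direct buffer, while the previous 28-byte capacity forces the same payload onto the heap. The capacity()/allocated_memory() checks mirror the script_size_and_capacity_test later in this diff:

#include <prevector.h>

#include <cassert>
#include <cstdint>

int main()
{
    prevector<36, uint8_t> wide(34, 0x00);   // 34 bytes: the P2WSH/P2TR script size
    assert(wide.capacity() == 36);           // held in the direct (inline) buffer
    assert(wide.allocated_memory() == 0);    // no heap allocation

    prevector<28, uint8_t> narrow(34, 0x00); // the previous inline capacity
    assert(narrow.allocated_memory() > 0);   // the same payload spills to the heap
}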

View file

@ -117,4 +117,41 @@ BOOST_AUTO_TEST_CASE(num_chain_tx_max)
BOOST_CHECK_EQUAL(block_index.m_chain_tx_count, std::numeric_limits<uint64_t>::max());
}
BOOST_FIXTURE_TEST_CASE(invalidate_block, TestChain100Setup)
{
const CChain& active{*WITH_LOCK(Assert(m_node.chainman)->GetMutex(), return &Assert(m_node.chainman)->ActiveChain())};
// Check BlockStatus when doing InvalidateBlock()
BlockValidationState state;
auto* orig_tip = active.Tip();
int height_to_invalidate = orig_tip->nHeight - 10;
auto* tip_to_invalidate = active[height_to_invalidate];
m_node.chainman->ActiveChainstate().InvalidateBlock(state, tip_to_invalidate);
// tip_to_invalidate just got invalidated, so it's BLOCK_FAILED_VALID
WITH_LOCK(::cs_main, assert(tip_to_invalidate->nStatus & BLOCK_FAILED_VALID));
WITH_LOCK(::cs_main, assert((tip_to_invalidate->nStatus & BLOCK_FAILED_CHILD) == 0));
// check all ancestors of the invalidated block are validated up to BLOCK_VALID_TRANSACTIONS and are not invalid
auto pindex = tip_to_invalidate->pprev;
while (pindex) {
WITH_LOCK(::cs_main, assert(pindex->IsValid(BLOCK_VALID_TRANSACTIONS)));
WITH_LOCK(::cs_main, assert((pindex->nStatus & BLOCK_FAILED_MASK) == 0));
pindex = pindex->pprev;
}
// check all descendants of the invalidated block are BLOCK_FAILED_CHILD
pindex = orig_tip;
while (pindex && pindex != tip_to_invalidate) {
WITH_LOCK(::cs_main, assert((pindex->nStatus & BLOCK_FAILED_VALID) == 0));
WITH_LOCK(::cs_main, assert(pindex->nStatus & BLOCK_FAILED_CHILD));
pindex = pindex->pprev;
}
// don't mark already invalidated block (orig_tip is BLOCK_FAILED_CHILD) with BLOCK_FAILED_VALID again
m_node.chainman->ActiveChainstate().InvalidateBlock(state, orig_tip);
WITH_LOCK(::cs_main, assert(orig_tip->nStatus & BLOCK_FAILED_CHILD));
WITH_LOCK(::cs_main, assert((orig_tip->nStatus & BLOCK_FAILED_VALID) == 0));
}
BOOST_AUTO_TEST_SUITE_END()

View file

@ -2610,6 +2610,63 @@
[["645168", 0.00000001], "0x22 0x0020f913eacf2e38a5d6fc3a8311d72ae704cb83866350a984dd3e5eb76d2a8c28e8", "HASH160 0x14 0xdbb7d1c0a56b7a9c423300c8cca6e6e065baf1dc EQUAL", "P2SH,WITNESS", "UNBALANCED_CONDITIONAL"],
[["645168", 0.00000001], "0x22 0x0020f913eacf2e38a5d6fc3a8311d72ae704cb83866350a984dd3e5eb76d2a8c28e8", "HASH160 0x14 0xdbb7d1c0a56b7a9c423300c8cca6e6e065baf1dc EQUAL", "P2SH,WITNESS,MINIMALIF", "UNBALANCED_CONDITIONAL"],
["Tapscript tests"],
[
[
"1ffe1234567890",
"00",
"#SCRIPT# HASH256 DUP SHA1 DROP DUP DROP TOALTSTACK HASH256 DUP DROP TOALTSTACK FROMALTSTACK",
"#CONTROLBLOCK#",
0.00000001
],
"",
"0x51 0x20 #TAPROOTOUTPUT#",
"P2SH,WITNESS,TAPROOT",
"OK",
"TAPSCRIPT Tests testing tapscript with many different op codes including ALTSTACK interactions"
],
[
[
"abcdef",
"#SCRIPT# 1 IF SHA256 ENDIF SIZE SWAP DROP 32 EQUAL",
"#CONTROLBLOCK#",
0.00000001
],
"",
"0x51 0x20 #TAPROOTOUTPUT#",
"P2SH,WITNESS,TAPROOT",
"OK",
"TAPSCRIPT Test IF conditional when true"
],
[
[
"abcdef",
"#SCRIPT# 0 IF SHA256 ENDIF SIZE SWAP DROP 32 EQUAL",
"#CONTROLBLOCK#",
0.00000001
],
"",
"0x51 0x20 #TAPROOTOUTPUT#",
"P2SH,WITNESS,TAPROOT",
"EVAL_FALSE",
"TAPSCRIPT Test IF conditional when false"
],
[
[
"aa",
"bb",
"cc",
"#SCRIPT# EQUAL IF DROP DROP ENDIF",
"#CONTROLBLOCK#",
0.00000001
],
"",
"0x51 0x20 #TAPROOTOUTPUT#",
"P2SH,WITNESS,TAPROOT",
"OK",
"TAPSCRIPT Test that DROP operations do not execute inside of a false IF conditional"
],
["NULLFAIL should cover all signatures and signatures only"],
["0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0", "0x01 0x14 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0x01 0x14 CHECKMULTISIG NOT", "DERSIG", "OK", "BIP66 and NULLFAIL-compliant"],
["0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0", "0x01 0x14 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0x01 0x14 CHECKMULTISIG NOT", "DERSIG,NULLFAIL", "OK", "BIP66 and NULLFAIL-compliant"],

View file

@ -149,10 +149,18 @@ static void initialize()
std::cerr << "No fuzz target compiled for " << g_fuzz_target << "." << std::endl;
std::exit(EXIT_FAILURE);
}
if constexpr (!G_FUZZING) {
std::cerr << "Must compile with -DBUILD_FOR_FUZZING=ON to execute a fuzz target." << std::endl;
if constexpr (!G_FUZZING_BUILD && !G_ABORT_ON_FAILED_ASSUME) {
std::cerr << "Must compile with -DBUILD_FOR_FUZZING=ON or in Debug mode to execute a fuzz target." << std::endl;
std::exit(EXIT_FAILURE);
}
if (!EnableFuzzDeterminism()) {
if (std::getenv("FUZZ_NONDETERMINISM")) {
std::cerr << "Warning: FUZZ_NONDETERMINISM env var set, results may be inconsistent with fuzz build" << std::endl;
} else {
g_enable_dynamic_fuzz_determinism = true;
assert(EnableFuzzDeterminism());
}
}
Assert(!g_test_one_input);
g_test_one_input = &it->second.test_one_input;
it->second.opts.init();

View file

@ -59,7 +59,7 @@ struct FuzzedWallet {
WalletDescriptor w_desc{std::move(parsed_desc), /*creation_time=*/0, /*range_start=*/0, /*range_end=*/1, /*next_index=*/0};
assert(!wallet->GetDescriptorScriptPubKeyMan(w_desc));
LOCK(wallet->cs_wallet);
auto spk_manager{wallet->AddWalletDescriptor(w_desc, keys, /*label=*/"", internal)};
auto spk_manager = *Assert(wallet->AddWalletDescriptor(w_desc, keys, /*label=*/"", internal));
assert(spk_manager);
wallet->AddActiveScriptPubKeyMan(spk_manager->GetID(), *Assert(w_desc.descriptor->GetOutputType()), internal);
}

View file

@ -916,16 +916,38 @@ BOOST_AUTO_TEST_CASE(script_json_test)
// amount (nValue) to use in the crediting tx
UniValue tests = read_json(json_tests::script_tests);
const KeyData keys;
for (unsigned int idx = 0; idx < tests.size(); idx++) {
const UniValue& test = tests[idx];
std::string strTest = test.write();
CScriptWitness witness;
TaprootBuilder taprootBuilder;
CAmount nValue = 0;
unsigned int pos = 0;
if (test.size() > 0 && test[pos].isArray()) {
unsigned int i=0;
for (i = 0; i < test[pos].size()-1; i++) {
witness.stack.push_back(ParseHex(test[pos][i].get_str()));
auto element = test[pos][i].get_str();
// We use #SCRIPT# to flag a non-hex script that we can read using ParseScript
// Taproot script must be third from the last element in witness stack
static const std::string SCRIPT_FLAG{"#SCRIPT#"};
if (element.starts_with(SCRIPT_FLAG)) {
CScript script = ParseScript(element.substr(SCRIPT_FLAG.size()));
witness.stack.push_back(ToByteVector(script));
} else if (element == "#CONTROLBLOCK#") {
// Taproot script control block - second from the last element in witness stack
// If #CONTROLBLOCK# we auto-generate the control block
taprootBuilder.Add(/*depth=*/0, witness.stack.back(), TAPROOT_LEAF_TAPSCRIPT, /*track=*/true);
taprootBuilder.Finalize(XOnlyPubKey(keys.key0.GetPubKey()));
auto controlblocks = taprootBuilder.GetSpendData().scripts[{witness.stack.back(), TAPROOT_LEAF_TAPSCRIPT}];
witness.stack.push_back(*(controlblocks.begin()));
} else {
const auto witness_value{TryParseHex<unsigned char>(element)};
if (!witness_value.has_value()) {
BOOST_ERROR("Bad witness in test: " << strTest << " witness is not hex: " << element);
}
witness.stack.push_back(witness_value.value());
}
}
nValue = AmountFromValue(test[pos][i]);
pos++;
@ -940,7 +962,14 @@ BOOST_AUTO_TEST_CASE(script_json_test)
std::string scriptSigString = test[pos++].get_str();
CScript scriptSig = ParseScript(scriptSigString);
std::string scriptPubKeyString = test[pos++].get_str();
CScript scriptPubKey = ParseScript(scriptPubKeyString);
CScript scriptPubKey;
// If requested, auto-generate the taproot output
if (scriptPubKeyString == "0x51 0x20 #TAPROOTOUTPUT#") {
BOOST_CHECK_MESSAGE(taprootBuilder.IsComplete(), "Failed to autogenerate Tapscript output key");
scriptPubKey = CScript() << OP_1 << ToByteVector(taprootBuilder.GetOutput());
} else {
scriptPubKey = ParseScript(scriptPubKeyString);
}
unsigned int scriptflags = ParseScriptFlags(test[pos++].get_str());
int scriptError = ParseScriptError(test[pos++].get_str());
@ -1129,6 +1158,91 @@ BOOST_AUTO_TEST_CASE(script_CHECKMULTISIG23)
BOOST_CHECK_MESSAGE(err == SCRIPT_ERR_INVALID_STACK_OPERATION, ScriptErrorString(err));
}
BOOST_AUTO_TEST_CASE(script_size_and_capacity_test)
{
BOOST_CHECK_EQUAL(sizeof(prevector<34, uint8_t>), sizeof(prevector<PREVECTOR_SIZE, uint8_t>));
BOOST_CHECK_EQUAL(sizeof(CScriptBase), 40);
BOOST_CHECK_EQUAL(sizeof(CScript), 40);
BOOST_CHECK_EQUAL(sizeof(CTxOut), 48);
CKey dummyKey;
dummyKey.MakeNewKey(true);
std::vector<std::vector<uint8_t>> dummyVSolutions;
// Small OP_RETURN has direct allocation
{
const auto scriptSmallOpReturn{CScript() << OP_RETURN << std::vector<uint8_t>(10, 0xaa)};
BOOST_CHECK_EQUAL(Solver(scriptSmallOpReturn, dummyVSolutions), TxoutType::NULL_DATA);
BOOST_CHECK_EQUAL(scriptSmallOpReturn.size(), 12);
BOOST_CHECK_EQUAL(scriptSmallOpReturn.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptSmallOpReturn.allocated_memory(), 0);
}
// P2WPKH has direct allocation
{
const auto scriptP2WPKH{GetScriptForDestination(WitnessV0KeyHash{PKHash{CKeyID{CPubKey{dummyKey.GetPubKey()}.GetID()}}})};
BOOST_CHECK_EQUAL(Solver(scriptP2WPKH, dummyVSolutions), TxoutType::WITNESS_V0_KEYHASH);
BOOST_CHECK_EQUAL(scriptP2WPKH.size(), 22);
BOOST_CHECK_EQUAL(scriptP2WPKH.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptP2WPKH.allocated_memory(), 0);
}
// P2SH has direct allocation
{
const auto scriptP2SH{GetScriptForDestination(ScriptHash{CScript{} << OP_TRUE})};
BOOST_CHECK(scriptP2SH.IsPayToScriptHash());
BOOST_CHECK_EQUAL(scriptP2SH.size(), 23);
BOOST_CHECK_EQUAL(scriptP2SH.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptP2SH.allocated_memory(), 0);
}
// P2PKH has direct allocation
{
const auto scriptP2PKH{GetScriptForDestination(PKHash{CKeyID{CPubKey{dummyKey.GetPubKey()}.GetID()}})};
BOOST_CHECK_EQUAL(Solver(scriptP2PKH, dummyVSolutions), TxoutType::PUBKEYHASH);
BOOST_CHECK_EQUAL(scriptP2PKH.size(), 25);
BOOST_CHECK_EQUAL(scriptP2PKH.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptP2PKH.allocated_memory(), 0);
}
// P2WSH has direct allocation
{
const auto scriptP2WSH{GetScriptForDestination(WitnessV0ScriptHash{CScript{} << OP_TRUE})};
BOOST_CHECK(scriptP2WSH.IsPayToWitnessScriptHash());
BOOST_CHECK_EQUAL(scriptP2WSH.size(), 34);
BOOST_CHECK_EQUAL(scriptP2WSH.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptP2WSH.allocated_memory(), 0);
}
// P2TR has direct allocation
{
const auto scriptTaproot{GetScriptForDestination(WitnessV1Taproot{XOnlyPubKey{CPubKey{dummyKey.GetPubKey()}}})};
BOOST_CHECK_EQUAL(Solver(scriptTaproot, dummyVSolutions), TxoutType::WITNESS_V1_TAPROOT);
BOOST_CHECK_EQUAL(scriptTaproot.size(), 34);
BOOST_CHECK_EQUAL(scriptTaproot.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptTaproot.allocated_memory(), 0);
}
// P2PK has direct allocation
{
const auto scriptPubKey{GetScriptForRawPubKey(CPubKey{dummyKey.GetPubKey()})};
BOOST_CHECK_EQUAL(Solver(scriptPubKey, dummyVSolutions), TxoutType::PUBKEY);
BOOST_CHECK_EQUAL(scriptPubKey.size(), 35);
BOOST_CHECK_EQUAL(scriptPubKey.capacity(), PREVECTOR_SIZE);
BOOST_CHECK_EQUAL(scriptPubKey.allocated_memory(), 0);
}
// MULTISIG needs extra allocation
{
const auto scriptMultisig{GetScriptForMultisig(1, std::vector{2, CPubKey{dummyKey.GetPubKey()}})};
BOOST_CHECK_EQUAL(Solver(scriptMultisig, dummyVSolutions), TxoutType::MULTISIG);
BOOST_CHECK_EQUAL(scriptMultisig.size(), 71);
BOOST_CHECK_EQUAL(scriptMultisig.capacity(), 103);
BOOST_CHECK_EQUAL(scriptMultisig.allocated_memory(), 103);
}
}
/* Wrapper around ProduceSignature to combine two scriptsigs */
SignatureData CombineSignatures(const CTxOut& txout, const CMutableTransaction& tx, const SignatureData& scriptSig1, const SignatureData& scriptSig2)
{

View file

@ -25,7 +25,7 @@ void SeedRandomStateForTest(SeedRand seedtype)
// no longer truly random. It should be enough to get the seed once for the
// process.
static const auto g_ctx_seed = []() -> std::optional<uint256> {
if constexpr (G_FUZZING) return {};
if (EnableFuzzDeterminism()) return {};
// If RANDOM_CTX_SEED is set, use that as seed.
if (const char* num{std::getenv(RANDOM_CTX_SEED)}) {
if (auto num_parsed{uint256::FromUserHex(num)}) {
@ -40,7 +40,7 @@ void SeedRandomStateForTest(SeedRand seedtype)
}();
g_seeded_g_prng_zero = seedtype == SeedRand::ZEROS;
if constexpr (G_FUZZING) {
if (EnableFuzzDeterminism()) {
Assert(g_seeded_g_prng_zero); // Only SeedRandomStateForTest(SeedRand::ZEROS) is allowed in fuzz tests
Assert(!g_used_g_prng); // The global PRNG must not have been used before SeedRandomStateForTest(SeedRand::ZEROS)
}

View file

@ -112,7 +112,7 @@ static void ExitFailure(std::string_view str_err)
BasicTestingSetup::BasicTestingSetup(const ChainType chainType, TestOpts opts)
: m_args{}
{
if constexpr (!G_FUZZING) {
if (!EnableFuzzDeterminism()) {
SeedRandomForTest(SeedRand::FIXED_SEED);
}
m_node.shutdown_signal = &m_interrupt;
@ -203,7 +203,7 @@ BasicTestingSetup::~BasicTestingSetup()
{
m_node.ecc_context.reset();
m_node.kernel.reset();
if constexpr (!G_FUZZING) {
if (!EnableFuzzDeterminism()) {
SetMockTime(0s); // Reset mocktime for following tests
}
LogInstance().DisconnectTestLogger();
@ -229,8 +229,9 @@ ChainTestingSetup::ChainTestingSetup(const ChainType chainType, TestOpts opts)
m_node.scheduler->m_service_thread = std::thread(util::TraceThread, "scheduler", [&] { m_node.scheduler->serviceQueue(); });
m_node.validation_signals =
// Use synchronous task runner while fuzzing to avoid non-determinism
G_FUZZING ? std::make_unique<ValidationSignals>(std::make_unique<util::ImmediateTaskRunner>()) :
std::make_unique<ValidationSignals>(std::make_unique<SerialTaskRunner>(*m_node.scheduler));
EnableFuzzDeterminism() ?
std::make_unique<ValidationSignals>(std::make_unique<util::ImmediateTaskRunner>()) :
std::make_unique<ValidationSignals>(std::make_unique<SerialTaskRunner>(*m_node.scheduler));
{
// Ensure deterministic coverage by waiting for m_service_thread to be running
std::promise<void> promise;
@ -255,7 +256,7 @@ ChainTestingSetup::ChainTestingSetup(const ChainType chainType, TestOpts opts)
.notifications = *m_node.notifications,
.signals = m_node.validation_signals.get(),
// Use no worker threads while fuzzing to avoid non-determinism
.worker_threads_num = G_FUZZING ? 0 : 2,
.worker_threads_num = EnableFuzzDeterminism() ? 0 : 2,
};
if (opts.min_validation_cache) {
chainman_opts.script_execution_cache_bytes = 0;

View file

@ -567,9 +567,9 @@ BOOST_AUTO_TEST_CASE(strprintf_numbers)
#undef B
#undef E
BOOST_AUTO_TEST_CASE(util_time_GetTime)
BOOST_AUTO_TEST_CASE(util_mocktime)
{
SetMockTime(111);
SetMockTime(111s);
// Check that mock time does not change after a sleep
for (const auto& num_sleep : {0ms, 1ms}) {
UninterruptibleSleep(num_sleep);
@ -583,13 +583,6 @@ BOOST_AUTO_TEST_CASE(util_time_GetTime)
BOOST_CHECK_EQUAL(111000000, GetTime<std::chrono::microseconds>().count());
}
SetMockTime(0s);
// Check that steady time changes after a sleep
const auto steady_ms_0 = Now<SteadyMilliseconds>();
const auto steady_0 = std::chrono::steady_clock::now();
UninterruptibleSleep(1ms);
BOOST_CHECK(steady_ms_0 < Now<SteadyMilliseconds>());
BOOST_CHECK(steady_0 + 1ms <= std::chrono::steady_clock::now());
}
BOOST_AUTO_TEST_CASE(test_IsDigit)

View file

@ -26,9 +26,8 @@ BOOST_AUTO_TEST_CASE(getcoinscachesizestate)
LOCK(::cs_main);
auto& view = chainstate.CoinsTip();
// The number of bytes consumed by coin's heap data, i.e. CScript
// (prevector<28, unsigned char>) when assigned 56 bytes of data per above.
//
// The number of bytes consumed by coin's heap data, i.e.
// CScript (prevector<36, unsigned char>) when assigned 56 bytes of data per above.
// See also: Coin::DynamicMemoryUsage().
constexpr unsigned int COIN_SIZE = is_64_bit ? 80 : 64;

View file

@ -33,3 +33,5 @@ void assertion_fail(std::string_view file, int line, std::string_view func, std:
fwrite(str.data(), 1, str.size(), stderr);
std::abort();
}
std::atomic<bool> g_enable_dynamic_fuzz_determinism{false};

View file

@ -7,19 +7,44 @@
#include <attributes.h>
#include <atomic>
#include <cassert> // IWYU pragma: export
#include <stdexcept>
#include <string>
#include <string_view>
#include <utility>
constexpr bool G_FUZZING{
constexpr bool G_FUZZING_BUILD{
#ifdef FUZZING_BUILD_MODE_UNSAFE_FOR_PRODUCTION
true
#else
false
#endif
};
constexpr bool G_ABORT_ON_FAILED_ASSUME{
#ifdef ABORT_ON_FAILED_ASSUME
true
#else
false
#endif
};
extern std::atomic<bool> g_enable_dynamic_fuzz_determinism;
inline bool EnableFuzzDeterminism()
{
if constexpr (G_FUZZING_BUILD) {
return true;
} else if constexpr (!G_ABORT_ON_FAILED_ASSUME) {
// Running fuzz tests is always disabled if Assume() doesn't abort
// (i.e. non-fuzz, non-debug builds), as otherwise tests that
// should fail due to a failing Assume may still pass. As such,
// fuzz determinism is also statically disabled in that case.
return false;
} else {
return g_enable_dynamic_fuzz_determinism;
}
}
std::string StrFormatInternalBug(std::string_view msg, std::string_view file, int line, std::string_view func);
@ -50,11 +75,7 @@ void assertion_fail(std::string_view file, int line, std::string_view func, std:
template <bool IS_ASSERT, typename T>
constexpr T&& inline_assertion_check(LIFETIMEBOUND T&& val, [[maybe_unused]] const char* file, [[maybe_unused]] int line, [[maybe_unused]] const char* func, [[maybe_unused]] const char* assertion)
{
if (IS_ASSERT || std::is_constant_evaluated() || G_FUZZING
#ifdef ABORT_ON_FAILED_ASSUME
|| true
#endif
) {
if (IS_ASSERT || std::is_constant_evaluated() || G_FUZZING_BUILD || G_ABORT_ON_FAILED_ASSUME) {
if (!val) {
assertion_fail(file, line, func, assertion);
}
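The decision made by EnableFuzzDeterminism() above, restated as a self-contained sketch with stand-in constants rather than the real G_FUZZING_BUILD/G_ABORT_ON_FAILED_ASSUME values: fuzz builds are always deterministic, builds whose Assume() does not abort never are, and aborting (Debug) builds can opt in at runtime, which is what the fuzz runner change above does:

#include <atomic>
#include <cassert>

constexpr bool kFuzzBuild{false};           // stand-in for G_FUZZING_BUILD
constexpr bool kAbortOnFailedAssume{true};  // stand-in for G_ABORT_ON_FAILED_ASSUME (Debug build)
std::atomic<bool> g_dynamic{false};         // stand-in for g_enable_dynamic_fuzz_determinism

bool Deterministic()
{
    if constexpr (kFuzzBuild) {
        return true;                        // fuzz builds: always deterministic
    } else if constexpr (!kAbortOnFailedAssume) {
        return false;                       // Assume() doesn't abort: fuzzing stays disabled
    } else {
        return g_dynamic.load();            // aborting builds: runtime opt-in
    }
}

int main()
{
    assert(!Deterministic());               // nothing has opted in yet
    g_dynamic = true;                       // what the fuzz runner does at startup
    assert(Deterministic());
}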

View file

@ -168,7 +168,7 @@ std::vector<T> Split(const std::span<const char>& sp, char sep)
[[nodiscard]] inline std::string_view RemovePrefixView(std::string_view str, std::string_view prefix)
{
if (str.substr(0, prefix.size()) == prefix) {
if (str.starts_with(prefix)) {
return str.substr(prefix.size());
}
return str;

View file

@ -3747,7 +3747,7 @@ bool Chainstate::InvalidateBlock(BlockValidationState& state, CBlockIndex* pinde
m_blockman.m_dirty_blockindex.insert(invalid_walk_tip);
setBlockIndexCandidates.erase(invalid_walk_tip);
setBlockIndexCandidates.insert(invalid_walk_tip->pprev);
if (invalid_walk_tip->pprev == to_mark_failed && (to_mark_failed->nStatus & BLOCK_FAILED_VALID)) {
if (invalid_walk_tip == to_mark_failed->pprev && (to_mark_failed->nStatus & BLOCK_FAILED_VALID)) {
// We only want to mark the last disconnected block as BLOCK_FAILED_VALID; its children
// need to be BLOCK_FAILED_CHILD instead.
to_mark_failed->nStatus = (to_mark_failed->nStatus ^ BLOCK_FAILED_VALID) | BLOCK_FAILED_CHILD;
@ -3779,11 +3779,13 @@ bool Chainstate::InvalidateBlock(BlockValidationState& state, CBlockIndex* pinde
return false;
}
// Mark pindex (or the last disconnected block) as invalid, even when it never was in the main chain
to_mark_failed->nStatus |= BLOCK_FAILED_VALID;
m_blockman.m_dirty_blockindex.insert(to_mark_failed);
setBlockIndexCandidates.erase(to_mark_failed);
m_chainman.m_failed_blocks.insert(to_mark_failed);
// Mark pindex as invalid if it never was in the main chain
if (!pindex_was_in_chain && !(pindex->nStatus & BLOCK_FAILED_MASK)) {
pindex->nStatus |= BLOCK_FAILED_VALID;
m_blockman.m_dirty_blockindex.insert(pindex);
setBlockIndexCandidates.erase(pindex);
m_chainman.m_failed_blocks.insert(pindex);
}
// If any new blocks somehow arrived while we were disconnecting
// (above), then the pre-calculation of what should go into
@ -3826,8 +3828,9 @@ void Chainstate::SetBlockFailureFlags(CBlockIndex* invalid_block)
AssertLockHeld(cs_main);
for (auto& [_, block_index] : m_blockman.m_block_index) {
if (block_index.GetAncestor(invalid_block->nHeight) == invalid_block && !(block_index.nStatus & BLOCK_FAILED_MASK)) {
block_index.nStatus |= BLOCK_FAILED_CHILD;
if (invalid_block != &block_index && block_index.GetAncestor(invalid_block->nHeight) == invalid_block) {
block_index.nStatus = (block_index.nStatus & ~BLOCK_FAILED_VALID) | BLOCK_FAILED_CHILD;
m_blockman.m_dirty_blockindex.insert(&block_index);
}
}
}
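A standalone sketch of the status update SetBlockFailureFlags() now applies per descendant, using stand-in values for the BlockStatus bits (the real enum lives in chain.h): any BLOCK_FAILED_VALID bit is cleared and BLOCK_FAILED_CHILD is set, matching the invariants asserted by the invalidate_block test earlier in this diff:

#include <cassert>
#include <cstdint>

int main()
{
    // Stand-in bit values; the real BlockStatus enum is defined in chain.h.
    constexpr uint32_t FAILED_VALID{1u << 5};
    constexpr uint32_t FAILED_CHILD{1u << 6};
    constexpr uint32_t FAILED_MASK{FAILED_VALID | FAILED_CHILD};

    uint32_t descendant_status{FAILED_VALID};                               // wrongly marked as the failure point
    descendant_status = (descendant_status & ~FAILED_VALID) | FAILED_CHILD; // the per-descendant update
    assert((descendant_status & FAILED_MASK) == FAILED_CHILD);

    uint32_t ancestor_status{0};                                            // ancestors stay clean
    assert((ancestor_status & FAILED_MASK) == 0);
}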

View file

@ -46,6 +46,6 @@ target_link_libraries(bitcoin_wallet
)
if(USE_BDB)
target_sources(bitcoin_wallet PRIVATE bdb.cpp salvage.cpp)
target_sources(bitcoin_wallet PRIVATE bdb.cpp)
target_link_libraries(bitcoin_wallet PUBLIC BerkeleyDB::BerkeleyDB)
endif()

View file

@ -182,7 +182,6 @@ public:
};
enum class DatabaseFormat {
BERKELEY,
SQLITE,
BERKELEY_RO,
BERKELEY_SWAP,

View file

@ -175,26 +175,14 @@ bool CreateFromDump(const ArgsManager& args, const std::string& name, const fs::
dump_file.close();
return false;
}
// Get the data file format with format_value as the default
std::string file_format = args.GetArg("-format", format_value);
if (file_format.empty()) {
error = _("No wallet file format provided. To use createfromdump, -format=<format> must be provided.");
// Make sure that the dump was created from a sqlite database, as that is the only
// type of database that we still support.
// Other formats such as BDB should not be loaded into a sqlite database, since they also
// use a different type of wallet entirely, which is no longer compatible with this software.
if (format_value != "sqlite") {
error = strprintf(_("Error: Dumpfile specifies an unsupported database format (%s). Only sqlite database dumps are supported"), format_value);
return false;
}
DatabaseFormat data_format;
if (file_format == "bdb") {
data_format = DatabaseFormat::BERKELEY;
} else if (file_format == "sqlite") {
data_format = DatabaseFormat::SQLITE;
} else if (file_format == "bdb_swap") {
data_format = DatabaseFormat::BERKELEY_SWAP;
} else {
error = strprintf(_("Unknown wallet file format \"%s\" provided. Please provide one of \"bdb\" or \"sqlite\"."), file_format);
return false;
}
if (file_format != format_value) {
warnings.push_back(strprintf(_("Warning: Dumpfile wallet format \"%s\" does not match command line specified format \"%s\"."), format_value, file_format));
}
std::string format_hasher_line = strprintf("%s,%s\n", format_key, format_value);
hasher << std::span{format_hasher_line};
@ -202,7 +190,7 @@ bool CreateFromDump(const ArgsManager& args, const std::string& name, const fs::
DatabaseStatus status;
ReadDatabaseArgs(args, options);
options.require_create = true;
options.require_format = data_format;
options.require_format = DatabaseFormat::SQLITE;
std::unique_ptr<WalletDatabase> database = MakeDatabase(wallet_path, options, status, error);
if (!database) return false;

View file

@ -40,11 +40,6 @@ static feebumper::Result PreconditionChecks(const CWallet& wallet, const CWallet
return feebumper::Result::WALLET_ERROR;
}
if (!SignalsOptInRBF(*wtx.tx)) {
errors.emplace_back(Untranslated("Transaction is not BIP 125 replaceable"));
return feebumper::Result::WALLET_ERROR;
}
if (wtx.mapValue.count("replaced_by_txid")) {
errors.push_back(Untranslated(strprintf("Cannot bump transaction %s which was already bumped by transaction %s", wtx.GetHash().ToString(), wtx.mapValue.at("replaced_by_txid"))));
return feebumper::Result::WALLET_ERROR;

View file

@ -98,11 +98,6 @@ void WalletInit::AddWalletOptions(ArgsManager& argsman) const
bool WalletInit::ParameterInteraction() const
{
#ifdef USE_BDB
if (!BerkeleyDatabaseSanityCheck()) {
return InitError(Untranslated("A version conflict was detected between the run-time BerkeleyDB library and the one used during compilation."));
}
#endif
if (gArgs.GetBoolArg("-disablewallet", DEFAULT_DISABLE_WALLET)) {
for (const std::string& wallet : gArgs.GetArgs("-wallet")) {
LogPrintf("%s: parameter interaction: -disablewallet -> ignoring -wallet=%s\n", __func__, wallet);

View file

@ -559,7 +559,7 @@ RPCHelpMan importwallet()
fLabel = false;
if (vstr[nStr] == "reserve=1")
fLabel = false;
if (vstr[nStr].substr(0,6) == "label=") {
if (vstr[nStr].starts_with("label=")) {
strLabel = DecodeDumpString(vstr[nStr].substr(6));
fLabel = true;
}
@ -1091,6 +1091,9 @@ static UniValue ProcessImportDescriptor(ImportData& import_data, std::map<CKeyID
std::tie(range_start, range_end) = ParseDescriptorRange(data["range"]);
}
// Only single key descriptors are allowed to be imported to a legacy wallet's keypool
bool can_keypool = parsed_descs.at(0)->IsSingleKey();
const UniValue& priv_keys = data.exists("keys") ? data["keys"].get_array() : UniValue();
for (size_t j = 0; j < parsed_descs.size(); ++j) {
@ -1107,8 +1110,10 @@ static UniValue ProcessImportDescriptor(ImportData& import_data, std::map<CKeyID
std::vector<CScript> scripts_temp;
parsed_desc->Expand(i, keys, scripts_temp, out_keys);
std::copy(scripts_temp.begin(), scripts_temp.end(), std::inserter(script_pub_keys, script_pub_keys.end()));
for (const auto& key_pair : out_keys.pubkeys) {
ordered_pubkeys.emplace_back(key_pair.first, desc_internal);
if (can_keypool) {
for (const auto& key_pair : out_keys.pubkeys) {
ordered_pubkeys.emplace_back(key_pair.first, desc_internal);
}
}
for (const auto& x : out_keys.scripts) {
@ -1575,16 +1580,15 @@ static UniValue ProcessDescriptorImport(CWallet& wallet, const UniValue& data, c
WalletDescriptor w_desc(std::move(parsed_desc), timestamp, range_start, range_end, next_index);
// Check if the wallet already contains the descriptor
auto existing_spk_manager = wallet.GetDescriptorScriptPubKeyMan(w_desc);
if (existing_spk_manager) {
if (!existing_spk_manager->CanUpdateToWalletDescriptor(w_desc, error)) {
throw JSONRPCError(RPC_INVALID_PARAMETER, error);
}
// Add descriptor to the wallet
auto spk_manager_res = wallet.AddWalletDescriptor(w_desc, keys, label, desc_internal);
if (!spk_manager_res) {
throw JSONRPCError(RPC_INVALID_PARAMETER, util::ErrorString(spk_manager_res).original);
}
// Add descriptor to the wallet
auto spk_manager = wallet.AddWalletDescriptor(w_desc, keys, label, desc_internal);
auto spk_manager = spk_manager_res.value();
if (spk_manager == nullptr) {
throw JSONRPCError(RPC_WALLET_ERROR, strprintf("Could not add descriptor '%s'", descriptor));
}

View file

@ -995,9 +995,9 @@ static RPCHelpMan bumpfee_helper(std::string method_name)
const std::string incremental_fee{CFeeRate(DEFAULT_INCREMENTAL_RELAY_FEE).ToString(FeeEstimateMode::SAT_VB)};
return RPCHelpMan{method_name,
"\nBumps the fee of an opt-in-RBF transaction T, replacing it with a new transaction B.\n"
"Bumps the fee of a transaction T, replacing it with a new transaction B.\n"
+ std::string(want_psbt ? "Returns a PSBT instead of creating and signing a new transaction.\n" : "") +
"An opt-in RBF transaction with the given txid must be in the wallet.\n"
"A transaction with the given txid must be in the wallet.\n"
"The command will pay the additional fee by reducing change outputs or adding inputs when necessary.\n"
"It may add a new change output if one does not already exist.\n"
"All inputs in the original transaction will be included in the replacement transaction.\n"
@ -1017,10 +1017,11 @@ static RPCHelpMan bumpfee_helper(std::string method_name)
"\nSpecify a fee rate in " + CURRENCY_ATOM + "/vB instead of relying on the built-in fee estimator.\n"
"Must be at least " + incremental_fee + " higher than the current transaction fee rate.\n"
"WARNING: before version 0.21, fee_rate was in " + CURRENCY_UNIT + "/kvB. As of 0.21, fee_rate is in " + CURRENCY_ATOM + "/vB.\n"},
{"replaceable", RPCArg::Type::BOOL, RPCArg::Default{true}, "Whether the new transaction should still be\n"
{"replaceable", RPCArg::Type::BOOL, RPCArg::Default{true},
"Whether the new transaction should be\n"
"marked bip-125 replaceable. If true, the sequence numbers in the transaction will\n"
"be left unchanged from the original. If false, any input sequence numbers in the\n"
"original transaction that were less than 0xfffffffe will be increased to 0xfffffffe\n"
"be set to 0xfffffffd. If false, any input sequence numbers in the\n"
"transaction will be set to 0xfffffffe\n"
"so the new transaction will not be explicitly bip-125 replaceable (though it may\n"
"still be replaceable in practice, for example if it has unconfirmed ancestors which\n"
"are replaceable).\n"},

View file

@ -355,9 +355,7 @@ static RPCHelpMan createwallet()
{"blank", RPCArg::Type::BOOL, RPCArg::Default{false}, "Create a blank wallet. A blank wallet has no keys or HD seed. One can be set using sethdseed."},
{"passphrase", RPCArg::Type::STR, RPCArg::Optional::OMITTED, "Encrypt the wallet with this passphrase."},
{"avoid_reuse", RPCArg::Type::BOOL, RPCArg::Default{false}, "Keep track of coin reuse, and treat dirty and clean coins differently with privacy considerations in mind."},
{"descriptors", RPCArg::Type::BOOL, RPCArg::Default{true}, "Create a native descriptor wallet. The wallet will use descriptors internally to handle address creation."
" Setting to \"false\" will create a legacy wallet; This is only possible with the -deprecatedrpc=create_bdb setting because, the legacy wallet type is being deprecated and"
" support for creating and opening legacy wallets will be removed in the future."},
{"descriptors", RPCArg::Type::BOOL, RPCArg::Default{true}, "If set, must be \"true\""},
{"load_on_startup", RPCArg::Type::BOOL, RPCArg::Optional::OMITTED, "Save wallet name to persistent settings and load on startup. True to add wallet to startup list, false to remove, null to leave unchanged."},
{"external_signer", RPCArg::Type::BOOL, RPCArg::Default{false}, "Use an external signer such as a hardware wallet. Requires -signer to be configured. Wallet creation will fail if keys cannot be fetched. Requires disable_private_keys and descriptors set to true."},
},
@ -402,13 +400,9 @@ static RPCHelpMan createwallet()
if (!request.params[4].isNull() && request.params[4].get_bool()) {
flags |= WALLET_FLAG_AVOID_REUSE;
}
if (self.Arg<bool>("descriptors")) {
flags |= WALLET_FLAG_DESCRIPTORS;
} else {
if (!context.chain->rpcEnableDeprecated("create_bdb")) {
throw JSONRPCError(RPC_WALLET_ERROR, "BDB wallet creation is deprecated and will be removed in a future release."
" In this release it can be re-enabled temporarily with the -deprecatedrpc=create_bdb setting.");
}
flags |= WALLET_FLAG_DESCRIPTORS;
if (!self.Arg<bool>("descriptors")) {
throw JSONRPCError(RPC_WALLET_ERROR, "descriptors argument must be set to \"true\"; it is no longer possible to create a legacy wallet.");
}
if (!request.params[7].isNull() && request.params[7].get_bool()) {
#ifdef ENABLE_EXTERNAL_SIGNER
@ -418,12 +412,6 @@ static RPCHelpMan createwallet()
#endif
}
#ifndef USE_BDB
if (!(flags & WALLET_FLAG_DESCRIPTORS)) {
throw JSONRPCError(RPC_WALLET_ERROR, "Compiled without bdb support (required for legacy wallets)");
}
#endif
DatabaseOptions options;
DatabaseStatus status;
ReadDatabaseArgs(*context.args, options);

View file

@ -1,221 +0,0 @@
// Copyright (c) 2009-2010 Satoshi Nakamoto
// Copyright (c) 2009-present The Bitcoin Core developers
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#include <streams.h>
#include <util/fs.h>
#include <util/translation.h>
#include <wallet/bdb.h>
#include <wallet/salvage.h>
#include <wallet/wallet.h>
#include <wallet/walletdb.h>
#include <db_cxx.h>
namespace wallet {
/* End of headers, beginning of key/value data */
static const char *HEADER_END = "HEADER=END";
/* End of key/value data */
static const char *DATA_END = "DATA=END";
typedef std::pair<std::vector<unsigned char>, std::vector<unsigned char> > KeyValPair;
class DummyCursor : public DatabaseCursor
{
Status Next(DataStream& key, DataStream& value) override { return Status::FAIL; }
};
/** RAII class that provides access to a DummyDatabase. Never fails. */
class DummyBatch : public DatabaseBatch
{
private:
bool ReadKey(DataStream&& key, DataStream& value) override { return true; }
bool WriteKey(DataStream&& key, DataStream&& value, bool overwrite=true) override { return true; }
bool EraseKey(DataStream&& key) override { return true; }
bool HasKey(DataStream&& key) override { return true; }
bool ErasePrefix(std::span<const std::byte> prefix) override { return true; }
public:
void Flush() override {}
void Close() override {}
std::unique_ptr<DatabaseCursor> GetNewCursor() override { return std::make_unique<DummyCursor>(); }
std::unique_ptr<DatabaseCursor> GetNewPrefixCursor(std::span<const std::byte> prefix) override { return GetNewCursor(); }
bool TxnBegin() override { return true; }
bool TxnCommit() override { return true; }
bool TxnAbort() override { return true; }
bool HasActiveTxn() override { return false; }
};
/** A dummy WalletDatabase that does nothing and never fails. Only used by salvage.
**/
class DummyDatabase : public WalletDatabase
{
public:
void Open() override {};
void AddRef() override {}
void RemoveRef() override {}
bool Rewrite(const char* pszSkip=nullptr) override { return true; }
bool Backup(const std::string& strDest) const override { return true; }
void Close() override {}
void Flush() override {}
bool PeriodicFlush() override { return true; }
void IncrementUpdateCounter() override { ++nUpdateCounter; }
void ReloadDbEnv() override {}
std::string Filename() override { return "dummy"; }
std::string Format() override { return "dummy"; }
std::unique_ptr<DatabaseBatch> MakeBatch(bool flush_on_close = true) override { return std::make_unique<DummyBatch>(); }
};
bool RecoverDatabaseFile(const ArgsManager& args, const fs::path& file_path, bilingual_str& error, std::vector<bilingual_str>& warnings)
{
DatabaseOptions options;
DatabaseStatus status;
ReadDatabaseArgs(args, options);
options.require_existing = true;
options.verify = false;
options.require_format = DatabaseFormat::BERKELEY;
std::unique_ptr<WalletDatabase> database = MakeDatabase(file_path, options, status, error);
if (!database) return false;
BerkeleyDatabase& berkeley_database = static_cast<BerkeleyDatabase&>(*database);
std::string filename = berkeley_database.Filename();
std::shared_ptr<BerkeleyEnvironment> env = berkeley_database.env;
if (!env->Open(error)) {
return false;
}
// Recovery procedure:
// move wallet file to walletfilename.timestamp.bak
// Call Salvage with fAggressive=true to
// get as much data as possible.
// Rewrite salvaged data to fresh wallet file
// Rescan so any missing transactions will be
// found.
int64_t now = GetTime();
std::string newFilename = strprintf("%s.%d.bak", filename, now);
int result = env->dbenv->dbrename(nullptr, filename.c_str(), nullptr,
newFilename.c_str(), DB_AUTO_COMMIT);
if (result != 0)
{
error = Untranslated(strprintf("Failed to rename %s to %s", filename, newFilename));
return false;
}
/**
* Salvage data from a file. The DB_AGGRESSIVE flag is being used (see berkeley DB->verify() method documentation).
* key/value pairs are appended to salvagedData which are then written out to a new wallet file.
* NOTE: reads the entire database into memory, so cannot be used
* for huge databases.
*/
std::vector<KeyValPair> salvagedData;
std::stringstream strDump;
Db db(env->dbenv.get(), 0);
result = db.verify(newFilename.c_str(), nullptr, &strDump, DB_SALVAGE | DB_AGGRESSIVE);
if (result == DB_VERIFY_BAD) {
warnings.emplace_back(Untranslated("Salvage: Database salvage found errors, all data may not be recoverable."));
}
if (result != 0 && result != DB_VERIFY_BAD) {
error = Untranslated(strprintf("Salvage: Database salvage failed with result %d.", result));
return false;
}
// Format of bdb dump is ascii lines:
// header lines...
// HEADER=END
// hexadecimal key
// hexadecimal value
// ... repeated
// DATA=END
std::string strLine;
while (!strDump.eof() && strLine != HEADER_END)
getline(strDump, strLine); // Skip past header
std::string keyHex, valueHex;
while (!strDump.eof() && keyHex != DATA_END) {
getline(strDump, keyHex);
if (keyHex != DATA_END) {
if (strDump.eof())
break;
getline(strDump, valueHex);
if (valueHex == DATA_END) {
warnings.emplace_back(Untranslated("Salvage: WARNING: Number of keys in data does not match number of values."));
break;
}
salvagedData.emplace_back(ParseHex(keyHex), ParseHex(valueHex));
}
}
bool fSuccess;
if (keyHex != DATA_END) {
warnings.emplace_back(Untranslated("Salvage: WARNING: Unexpected end of file while reading salvage output."));
fSuccess = false;
} else {
fSuccess = (result == 0);
}
if (salvagedData.empty())
{
error = Untranslated(strprintf("Salvage(aggressive) found no records in %s.", newFilename));
return false;
}
std::unique_ptr<Db> pdbCopy = std::make_unique<Db>(env->dbenv.get(), 0);
int ret = pdbCopy->open(nullptr, // Txn pointer
filename.c_str(), // Filename
"main", // Logical db name
DB_BTREE, // Database type
DB_CREATE, // Flags
0);
if (ret > 0) {
error = Untranslated(strprintf("Cannot create database file %s", filename));
pdbCopy->close(0);
return false;
}
DbTxn* ptxn = env->TxnBegin(DB_TXN_WRITE_NOSYNC);
CWallet dummyWallet(nullptr, "", std::make_unique<DummyDatabase>());
for (KeyValPair& row : salvagedData)
{
/* Filter for only private key type KV pairs to be added to the salvaged wallet */
DataStream ssKey{row.first};
DataStream ssValue(row.second);
std::string strType, strErr;
// We only care about KEY, MASTER_KEY, CRYPTED_KEY, and HDCHAIN types
ssKey >> strType;
bool fReadOK = false;
if (strType == DBKeys::KEY) {
fReadOK = LoadKey(&dummyWallet, ssKey, ssValue, strErr);
} else if (strType == DBKeys::CRYPTED_KEY) {
fReadOK = LoadCryptedKey(&dummyWallet, ssKey, ssValue, strErr);
} else if (strType == DBKeys::MASTER_KEY) {
fReadOK = LoadEncryptionKey(&dummyWallet, ssKey, ssValue, strErr);
} else if (strType == DBKeys::HDCHAIN) {
fReadOK = LoadHDChain(&dummyWallet, ssValue, strErr);
} else {
continue;
}
if (!fReadOK)
{
warnings.push_back(Untranslated(strprintf("WARNING: WalletBatch::Recover skipping %s: %s", strType, strErr)));
continue;
}
Dbt datKey(row.first.data(), row.first.size());
Dbt datValue(row.second.data(), row.second.size());
int ret2 = pdbCopy->put(ptxn, &datKey, &datValue, DB_NOOVERWRITE);
if (ret2 > 0)
fSuccess = false;
}
ptxn->commit(0);
pdbCopy->close(0);
return fSuccess;
}
} // namespace wallet

View file

@ -1,19 +0,0 @@
// Copyright (c) 2009-2010 Satoshi Nakamoto
// Copyright (c) 2009-2021 The Bitcoin Core developers
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#ifndef BITCOIN_WALLET_SALVAGE_H
#define BITCOIN_WALLET_SALVAGE_H
#include <streams.h>
#include <util/fs.h>
class ArgsManager;
struct bilingual_str;
namespace wallet {
bool RecoverDatabaseFile(const ArgsManager& args, const fs::path& file_path, bilingual_str& error, std::vector<bilingual_str>& warnings);
} // namespace wallet
#endif // BITCOIN_WALLET_SALVAGE_H

View file

@ -2827,12 +2827,12 @@ void DescriptorScriptPubKeyMan::UpgradeDescriptorCache()
}
}
void DescriptorScriptPubKeyMan::UpdateWalletDescriptor(WalletDescriptor& descriptor)
util::Result<void> DescriptorScriptPubKeyMan::UpdateWalletDescriptor(WalletDescriptor& descriptor)
{
LOCK(cs_desc_man);
std::string error;
if (!CanUpdateToWalletDescriptor(descriptor, error)) {
throw std::runtime_error(std::string(__func__) + ": " + error);
return util::Error{Untranslated(std::move(error))};
}
m_map_pubkeys.clear();
@ -2841,6 +2841,7 @@ void DescriptorScriptPubKeyMan::UpdateWalletDescriptor(WalletDescriptor& descrip
m_wallet_descriptor = descriptor;
NotifyFirstKeyTimeChanged(this, m_wallet_descriptor.creation_time);
return {};
}
bool DescriptorScriptPubKeyMan::CanUpdateToWalletDescriptor(const WalletDescriptor& descriptor, std::string& error)

View file

@ -694,7 +694,7 @@ public:
bool AddCryptedKey(const CKeyID& key_id, const CPubKey& pubkey, const std::vector<unsigned char>& crypted_key);
bool HasWalletDescriptor(const WalletDescriptor& desc) const;
void UpdateWalletDescriptor(WalletDescriptor& descriptor);
util::Result<void> UpdateWalletDescriptor(WalletDescriptor& descriptor);
bool CanUpdateToWalletDescriptor(const WalletDescriptor& descriptor, std::string& error);
void AddDescriptorKey(const CKey& key, const CPubKey &pubkey);
void WriteDescriptor();

View file

@ -2,17 +2,12 @@
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#include <bitcoin-build-config.h> // IWYU pragma: keep
#include <boost/test/unit_test.hpp>
#include <test/util/setup_common.h>
#include <util/check.h>
#include <util/fs.h>
#include <util/translation.h>
#ifdef USE_BDB
#include <wallet/bdb.h>
#endif
#include <wallet/sqlite.h>
#include <wallet/migrate.h>
#include <wallet/test/util.h>
@ -60,82 +55,13 @@ static void CheckPrefix(DatabaseBatch& batch, std::span<const std::byte> prefix,
BOOST_FIXTURE_TEST_SUITE(db_tests, BasicTestingSetup)
#ifdef USE_BDB
static std::shared_ptr<BerkeleyEnvironment> GetWalletEnv(const fs::path& path, fs::path& database_filename)
{
fs::path data_file = BDBDataFile(path);
database_filename = data_file.filename();
return GetBerkeleyEnv(data_file.parent_path(), false);
}
BOOST_AUTO_TEST_CASE(getwalletenv_file)
{
fs::path test_name = "test_name.dat";
const fs::path datadir = m_args.GetDataDirNet();
fs::path file_path = datadir / test_name;
std::ofstream f{file_path};
f.close();
fs::path filename;
std::shared_ptr<BerkeleyEnvironment> env = GetWalletEnv(file_path, filename);
BOOST_CHECK_EQUAL(filename, test_name);
BOOST_CHECK_EQUAL(env->Directory(), datadir);
}
BOOST_AUTO_TEST_CASE(getwalletenv_directory)
{
fs::path expected_name = "wallet.dat";
const fs::path datadir = m_args.GetDataDirNet();
fs::path filename;
std::shared_ptr<BerkeleyEnvironment> env = GetWalletEnv(datadir, filename);
BOOST_CHECK_EQUAL(filename, expected_name);
BOOST_CHECK_EQUAL(env->Directory(), datadir);
}
BOOST_AUTO_TEST_CASE(getwalletenv_g_dbenvs_multiple)
{
fs::path datadir = m_args.GetDataDirNet() / "1";
fs::path datadir_2 = m_args.GetDataDirNet() / "2";
fs::path filename;
std::shared_ptr<BerkeleyEnvironment> env_1 = GetWalletEnv(datadir, filename);
std::shared_ptr<BerkeleyEnvironment> env_2 = GetWalletEnv(datadir, filename);
std::shared_ptr<BerkeleyEnvironment> env_3 = GetWalletEnv(datadir_2, filename);
BOOST_CHECK(env_1 == env_2);
BOOST_CHECK(env_2 != env_3);
}
BOOST_AUTO_TEST_CASE(getwalletenv_g_dbenvs_free_instance)
{
fs::path datadir = gArgs.GetDataDirNet() / "1";
fs::path datadir_2 = gArgs.GetDataDirNet() / "2";
fs::path filename;
std::shared_ptr <BerkeleyEnvironment> env_1_a = GetWalletEnv(datadir, filename);
std::shared_ptr <BerkeleyEnvironment> env_2_a = GetWalletEnv(datadir_2, filename);
env_1_a.reset();
std::shared_ptr<BerkeleyEnvironment> env_1_b = GetWalletEnv(datadir, filename);
std::shared_ptr<BerkeleyEnvironment> env_2_b = GetWalletEnv(datadir_2, filename);
BOOST_CHECK(env_1_a != env_1_b);
BOOST_CHECK(env_2_a == env_2_b);
}
#endif
static std::vector<std::unique_ptr<WalletDatabase>> TestDatabases(const fs::path& path_root)
{
std::vector<std::unique_ptr<WalletDatabase>> dbs;
DatabaseOptions options;
DatabaseStatus status;
bilingual_str error;
#ifdef USE_BDB
dbs.emplace_back(MakeBerkeleyDatabase(path_root / "bdb", options, status, error));
// Needs BDB to make the DB to read
dbs.emplace_back(std::make_unique<BerkeleyRODatabase>(BDBDataFile(path_root / "bdb"), /*open=*/false));
#endif
// Unable to test BerkeleyRO since we cannot create a new BDB database to open
dbs.emplace_back(MakeSQLiteDatabase(path_root / "sqlite", options, status, error));
dbs.emplace_back(CreateMockableWalletDatabase());
return dbs;
@ -148,15 +74,10 @@ BOOST_AUTO_TEST_CASE(db_cursor_prefix_range_test)
std::vector<std::string> prefixes = {"", "FIRST", "SECOND", "P\xfe\xff", "P\xff\x01", "\xff\xff"};
std::unique_ptr<DatabaseBatch> handler = Assert(database)->MakeBatch();
if (dynamic_cast<BerkeleyRODatabase*>(database.get())) {
// For BerkeleyRO, open the file now. This must happen after BDB has written to the file
database->Open();
} else {
// Write elements to it if not berkeleyro
for (unsigned int i = 0; i < 10; i++) {
for (const auto& prefix : prefixes) {
BOOST_CHECK(handler->Write(std::make_pair(prefix, i), i));
}
// Write elements to it
for (unsigned int i = 0; i < 10; i++) {
for (const auto& prefix : prefixes) {
BOOST_CHECK(handler->Write(std::make_pair(prefix, i), i));
}
}
@ -206,14 +127,9 @@ BOOST_AUTO_TEST_CASE(db_cursor_prefix_byte_test)
for (const auto& database : TestDatabases(m_path_root)) {
std::unique_ptr<DatabaseBatch> batch = database->MakeBatch();
if (dynamic_cast<BerkeleyRODatabase*>(database.get())) {
// For BerkeleyRO, open the file now. This must happen after BDB has written to the file
database->Open();
} else {
// Write elements to it if not berkeleyro
for (const auto& [k, v] : {e, p, ps, f, fs, ff, ffs}) {
batch->Write(std::span{k}, std::span{v});
}
// Write elements to it
for (const auto& [k, v] : {e, p, ps, f, fs, ff, ffs}) {
batch->Write(std::span{k}, std::span{v});
}
CheckPrefix(*batch, StringBytes(""), {e, p, ps, f, fs, ff, ffs});
@ -231,10 +147,6 @@ BOOST_AUTO_TEST_CASE(db_availability_after_write_error)
// To simulate the behavior, record overwrites are disallowed, and the test verifies
// that the database remains active after failing to store an existing record.
for (const auto& database : TestDatabases(m_path_root)) {
if (dynamic_cast<BerkeleyRODatabase*>(database.get())) {
// Skip this test if BerkeleyRO
continue;
}
// Write original record
std::unique_ptr<DatabaseBatch> batch = database->MakeBatch();
std::string key = "key";

View file

@ -80,8 +80,12 @@ static std::optional<std::pair<WalletDescriptor, FlatSigningProvider>> CreateWal
static DescriptorScriptPubKeyMan* CreateDescriptor(WalletDescriptor& wallet_desc, FlatSigningProvider& keys, CWallet& keystore)
{
LOCK(keystore.cs_wallet);
keystore.AddWalletDescriptor(wallet_desc, keys, /*label=*/"", /*internal=*/false);
return keystore.GetDescriptorScriptPubKeyMan(wallet_desc);
DescriptorScriptPubKeyMan* descriptor_spk_manager = nullptr;
auto spk_manager = *Assert(keystore.AddWalletDescriptor(wallet_desc, keys, /*label=*/"", /*internal=*/false));
if (spk_manager) {
descriptor_spk_manager = dynamic_cast<DescriptorScriptPubKeyMan*>(spk_manager);
}
return descriptor_spk_manager;
};
FUZZ_TARGET(scriptpubkeyman, .init = initialize_spkm)

View file

@ -27,7 +27,8 @@ static void import_descriptor(CWallet& wallet, const std::string& descriptor)
assert(descs.size() == 1);
auto& desc = descs.at(0);
WalletDescriptor w_desc(std::move(desc), 0, 0, 10, 0);
wallet.AddWalletDescriptor(w_desc, provider, "", false);
auto spk_manager = *Assert(wallet.AddWalletDescriptor(w_desc, provider, "", false));
assert(spk_manager);
}
BOOST_AUTO_TEST_CASE(psbt_updater_test)

View file

@ -35,7 +35,8 @@ std::unique_ptr<CWallet> CreateSyncedWallet(interfaces::Chain& chain, CChain& cc
assert(descs.size() == 1);
auto& desc = descs.at(0);
WalletDescriptor w_desc(std::move(desc), 0, 0, 1, 1);
if (!wallet->AddWalletDescriptor(w_desc, provider, "", false)) assert(false);
auto spk_manager = *Assert(wallet->AddWalletDescriptor(w_desc, provider, "", false));
assert(spk_manager);
}
WalletRescanReserver reserver(*wallet);
reserver.reserve();
@ -209,7 +210,7 @@ wallet::ScriptPubKeyMan* CreateDescriptor(CWallet& keystore, const std::string&
WalletDescriptor w_desc(std::move(desc), timestamp, range_start, range_end, next_index);
LOCK(keystore.cs_wallet);
return Assert(keystore.AddWalletDescriptor(w_desc, keys,/*label=*/"", /*internal=*/false));
auto spkm = Assert(keystore.AddWalletDescriptor(w_desc, keys,/*label=*/"", /*internal=*/false));
return spkm.value();
};
} // namespace wallet

View file

@ -5,8 +5,6 @@
#ifndef BITCOIN_WALLET_TEST_UTIL_H
#define BITCOIN_WALLET_TEST_UTIL_H
#include <bitcoin-build-config.h> // IWYU pragma: keep
#include <addresstype.h>
#include <wallet/db.h>
#include <wallet/scriptpubkeyman.h>
@ -28,9 +26,6 @@ struct WalletContext;
static const DatabaseFormat DATABASE_FORMATS[] = {
DatabaseFormat::SQLITE,
#ifdef USE_BDB
DatabaseFormat::BERKELEY,
#endif
};
const std::string ADDRESS_BCRT1_UNSPENDABLE = "bcrt1qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq3xueyj";

View file

@ -69,7 +69,8 @@ static void AddKey(CWallet& wallet, const CKey& key)
assert(descs.size() == 1);
auto& desc = descs.at(0);
WalletDescriptor w_desc(std::move(desc), 0, 0, 1, 1);
if (!wallet.AddWalletDescriptor(w_desc, provider, "", false)) assert(false);
auto spk_manager = *Assert(wallet.AddWalletDescriptor(w_desc, provider, "", false));
assert(spk_manager);
}
BOOST_FIXTURE_TEST_CASE(scan_for_wallet_transactions, TestChain100Setup)
@ -195,141 +196,6 @@ BOOST_FIXTURE_TEST_CASE(scan_for_wallet_transactions, TestChain100Setup)
}
}
BOOST_FIXTURE_TEST_CASE(importmulti_rescan, TestChain100Setup)
{
// Cap last block file size, and mine new block in a new block file.
CBlockIndex* oldTip = WITH_LOCK(Assert(m_node.chainman)->GetMutex(), return m_node.chainman->ActiveChain().Tip());
WITH_LOCK(::cs_main, m_node.chainman->m_blockman.GetBlockFileInfo(oldTip->GetBlockPos().nFile)->nSize = MAX_BLOCKFILE_SIZE);
CreateAndProcessBlock({}, GetScriptForRawPubKey(coinbaseKey.GetPubKey()));
CBlockIndex* newTip = WITH_LOCK(Assert(m_node.chainman)->GetMutex(), return m_node.chainman->ActiveChain().Tip());
// Prune the older block file.
int file_number;
{
LOCK(cs_main);
file_number = oldTip->GetBlockPos().nFile;
Assert(m_node.chainman)->m_blockman.PruneOneBlockFile(file_number);
}
m_node.chainman->m_blockman.UnlinkPrunedFiles({file_number});
// Verify importmulti RPC returns failure for a key whose creation time is
// before the missing block, and success for a key whose creation time is
// after.
{
const std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(m_node.chain.get(), "", CreateMockableWalletDatabase());
wallet->SetupLegacyScriptPubKeyMan();
WITH_LOCK(wallet->cs_wallet, wallet->SetLastBlockProcessed(newTip->nHeight, newTip->GetBlockHash()));
WalletContext context;
context.args = &m_args;
AddWallet(context, wallet);
UniValue keys;
keys.setArray();
UniValue key;
key.setObject();
key.pushKV("scriptPubKey", HexStr(GetScriptForRawPubKey(coinbaseKey.GetPubKey())));
key.pushKV("timestamp", 0);
key.pushKV("internal", UniValue(true));
keys.push_back(key);
key.clear();
key.setObject();
CKey futureKey = GenerateRandomKey();
key.pushKV("scriptPubKey", HexStr(GetScriptForRawPubKey(futureKey.GetPubKey())));
key.pushKV("timestamp", newTip->GetBlockTimeMax() + TIMESTAMP_WINDOW + 1);
key.pushKV("internal", UniValue(true));
keys.push_back(std::move(key));
JSONRPCRequest request;
request.context = &context;
request.params.setArray();
request.params.push_back(std::move(keys));
UniValue response = importmulti().HandleRequest(request);
BOOST_CHECK_EQUAL(response.write(),
strprintf("[{\"success\":false,\"error\":{\"code\":-1,\"message\":\"Rescan failed for key with creation "
"timestamp %d. There was an error reading a block from time %d, which is after or within %d "
"seconds of key creation, and could contain transactions pertaining to the key. As a result, "
"transactions and coins using this key may not appear in the wallet. This error could be caused "
"by pruning or data corruption (see bitcoind log for details) and could be dealt with by "
"downloading and rescanning the relevant blocks (see -reindex option and rescanblockchain "
"RPC).\"}},{\"success\":true}]",
0, oldTip->GetBlockTimeMax(), TIMESTAMP_WINDOW));
RemoveWallet(context, wallet, /* load_on_start= */ std::nullopt);
}
}
// Verify importwallet RPC starts rescan at earliest block with timestamp
// greater than or equal to key birthday. Previously there was a bug where
// importwallet RPC would start the scan at the latest block with timestamp less
// than or equal to key birthday.
BOOST_FIXTURE_TEST_CASE(importwallet_rescan, TestChain100Setup)
{
// Create two blocks with same timestamp to verify that importwallet rescan
// will pick up both blocks, not just the first.
const int64_t BLOCK_TIME = WITH_LOCK(Assert(m_node.chainman)->GetMutex(), return m_node.chainman->ActiveChain().Tip()->GetBlockTimeMax() + 5);
SetMockTime(BLOCK_TIME);
m_coinbase_txns.emplace_back(CreateAndProcessBlock({}, GetScriptForRawPubKey(coinbaseKey.GetPubKey())).vtx[0]);
m_coinbase_txns.emplace_back(CreateAndProcessBlock({}, GetScriptForRawPubKey(coinbaseKey.GetPubKey())).vtx[0]);
// Set key birthday to block time increased by the timestamp window, so
// rescan will start at the block time.
const int64_t KEY_TIME = BLOCK_TIME + TIMESTAMP_WINDOW;
SetMockTime(KEY_TIME);
m_coinbase_txns.emplace_back(CreateAndProcessBlock({}, GetScriptForRawPubKey(coinbaseKey.GetPubKey())).vtx[0]);
std::string backup_file = fs::PathToString(m_args.GetDataDirNet() / "wallet.backup");
// Import key into wallet and call dumpwallet to create backup file.
{
WalletContext context;
context.args = &m_args;
const std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(m_node.chain.get(), "", CreateMockableWalletDatabase());
{
auto spk_man = wallet->GetOrCreateLegacyScriptPubKeyMan();
LOCK2(wallet->cs_wallet, spk_man->cs_KeyStore);
spk_man->mapKeyMetadata[coinbaseKey.GetPubKey().GetID()].nCreateTime = KEY_TIME;
spk_man->AddKeyPubKey(coinbaseKey, coinbaseKey.GetPubKey());
AddWallet(context, wallet);
LOCK(Assert(m_node.chainman)->GetMutex());
wallet->SetLastBlockProcessed(m_node.chainman->ActiveChain().Height(), m_node.chainman->ActiveChain().Tip()->GetBlockHash());
}
JSONRPCRequest request;
request.context = &context;
request.params.setArray();
request.params.push_back(backup_file);
wallet::dumpwallet().HandleRequest(request);
RemoveWallet(context, wallet, /* load_on_start= */ std::nullopt);
}
// Call importwallet RPC and verify all blocks with timestamps >= BLOCK_TIME
// were scanned, and no prior blocks were scanned.
{
const std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(m_node.chain.get(), "", CreateMockableWalletDatabase());
LOCK(wallet->cs_wallet);
wallet->SetupLegacyScriptPubKeyMan();
WalletContext context;
context.args = &m_args;
JSONRPCRequest request;
request.context = &context;
request.params.setArray();
request.params.push_back(backup_file);
AddWallet(context, wallet);
LOCK(Assert(m_node.chainman)->GetMutex());
wallet->SetLastBlockProcessed(m_node.chainman->ActiveChain().Height(), m_node.chainman->ActiveChain().Tip()->GetBlockHash());
wallet::importwallet().HandleRequest(request);
RemoveWallet(context, wallet, /* load_on_start= */ std::nullopt);
BOOST_CHECK_EQUAL(wallet->mapWallet.size(), 3U);
BOOST_CHECK_EQUAL(m_coinbase_txns.size(), 103U);
for (size_t i = 0; i < m_coinbase_txns.size(); ++i) {
bool found = wallet->GetWalletTx(m_coinbase_txns[i]->GetHash());
bool expected = i >= 100;
BOOST_CHECK_EQUAL(found, expected);
}
}
}
// This test verifies that wallet settings can be added and removed
// concurrently, ensuring no race conditions occur during either process.
BOOST_FIXTURE_TEST_CASE(write_wallet_settings_concurrently, TestingSetup)
@ -495,87 +361,6 @@ BOOST_FIXTURE_TEST_CASE(LoadReceiveRequests, TestingSetup)
}
}
// Test some watch-only LegacyScriptPubKeyMan methods by loading (LoadWatchOnly),
// checking (HaveWatchOnly), getting (GetWatchPubKey) and removing (RemoveWatchOnly) a
// given PubKey, or rather its corresponding P2PK script. The effect on the
// address -> PubKey map depends on whether the PubKey is a point on the curve.
static void TestWatchOnlyPubKey(LegacyScriptPubKeyMan* spk_man, const CPubKey& add_pubkey)
{
CScript p2pk = GetScriptForRawPubKey(add_pubkey);
CKeyID add_address = add_pubkey.GetID();
CPubKey found_pubkey;
LOCK(spk_man->cs_KeyStore);
// all Scripts (i.e. also all PubKeys) are added to the general watch-only set
BOOST_CHECK(!spk_man->HaveWatchOnly(p2pk));
spk_man->LoadWatchOnly(p2pk);
BOOST_CHECK(spk_man->HaveWatchOnly(p2pk));
// only PubKeys on the curve shall be added to the watch-only address -> PubKey map
bool is_pubkey_fully_valid = add_pubkey.IsFullyValid();
if (is_pubkey_fully_valid) {
BOOST_CHECK(spk_man->GetWatchPubKey(add_address, found_pubkey));
BOOST_CHECK(found_pubkey == add_pubkey);
} else {
BOOST_CHECK(!spk_man->GetWatchPubKey(add_address, found_pubkey));
BOOST_CHECK(found_pubkey == CPubKey()); // passed key is unchanged
}
spk_man->RemoveWatchOnly(p2pk);
BOOST_CHECK(!spk_man->HaveWatchOnly(p2pk));
if (is_pubkey_fully_valid) {
BOOST_CHECK(!spk_man->GetWatchPubKey(add_address, found_pubkey));
BOOST_CHECK(found_pubkey == add_pubkey); // passed key is unchanged
}
}
// Cryptographically invalidate a PubKey whilst keeping length and first byte
static void PollutePubKey(CPubKey& pubkey)
{
assert(pubkey.size() >= 1);
std::vector<unsigned char> pubkey_raw;
pubkey_raw.push_back(pubkey[0]);
pubkey_raw.insert(pubkey_raw.end(), pubkey.size() - 1, 0);
pubkey = CPubKey(pubkey_raw);
assert(!pubkey.IsFullyValid());
assert(pubkey.IsValid());
}
// Test watch-only logic for PubKeys
BOOST_AUTO_TEST_CASE(WatchOnlyPubKeys)
{
CKey key;
CPubKey pubkey;
LegacyScriptPubKeyMan* spk_man = m_wallet.GetOrCreateLegacyScriptPubKeyMan();
BOOST_CHECK(!spk_man->HaveWatchOnly());
// uncompressed valid PubKey
key.MakeNewKey(false);
pubkey = key.GetPubKey();
assert(!pubkey.IsCompressed());
TestWatchOnlyPubKey(spk_man, pubkey);
// uncompressed cryptographically invalid PubKey
PollutePubKey(pubkey);
TestWatchOnlyPubKey(spk_man, pubkey);
// compressed valid PubKey
key.MakeNewKey(true);
pubkey = key.GetPubKey();
assert(pubkey.IsCompressed());
TestWatchOnlyPubKey(spk_man, pubkey);
// compressed cryptographically invalid PubKey
PollutePubKey(pubkey);
TestWatchOnlyPubKey(spk_man, pubkey);
// invalid empty PubKey
pubkey = CPubKey();
TestWatchOnlyPubKey(spk_man, pubkey);
}
class ListCoinsTestingSetup : public TestChain100Setup
{
public:
@ -718,22 +503,12 @@ BOOST_FIXTURE_TEST_CASE(BasicOutputTypesTest, ListCoinsTest)
BOOST_FIXTURE_TEST_CASE(wallet_disableprivkeys, TestChain100Setup)
{
{
const std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(m_node.chain.get(), "", CreateMockableWalletDatabase());
wallet->SetupLegacyScriptPubKeyMan();
wallet->SetMinVersion(FEATURE_LATEST);
wallet->SetWalletFlag(WALLET_FLAG_DISABLE_PRIVATE_KEYS);
BOOST_CHECK(!wallet->TopUpKeyPool(1000));
BOOST_CHECK(!wallet->GetNewDestination(OutputType::BECH32, ""));
}
{
const std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(m_node.chain.get(), "", CreateMockableWalletDatabase());
LOCK(wallet->cs_wallet);
wallet->SetWalletFlag(WALLET_FLAG_DESCRIPTORS);
wallet->SetMinVersion(FEATURE_LATEST);
wallet->SetWalletFlag(WALLET_FLAG_DISABLE_PRIVATE_KEYS);
BOOST_CHECK(!wallet->GetNewDestination(OutputType::BECH32, ""));
}
const std::shared_ptr<CWallet> wallet = std::make_shared<CWallet>(m_node.chain.get(), "", CreateMockableWalletDatabase());
LOCK(wallet->cs_wallet);
wallet->SetWalletFlag(WALLET_FLAG_DESCRIPTORS);
wallet->SetMinVersion(FEATURE_LATEST);
wallet->SetWalletFlag(WALLET_FLAG_DISABLE_PRIVATE_KEYS);
BOOST_CHECK(!wallet->GetNewDestination(OutputType::BECH32, ""));
}
// Explicit calculation which is used to test the wallet constant

View file

@ -26,6 +26,7 @@ public:
bool IsRange() const override { return false; }
bool IsSolvable() const override { return false; }
bool IsSingleType() const override { return true; }
bool IsSingleKey() const override { return true; }
bool ToPrivateString(const SigningProvider& provider, std::string& out) const override { return false; }
bool ToNormalizedString(const SigningProvider& provider, std::string& out, const DescriptorCache* cache = nullptr) const override { return false; }
bool Expand(int pos, const SigningProvider& provider, std::vector<CScript>& output_scripts, FlatSigningProvider& out, DescriptorCache* write_cache = nullptr) const override { return false; };
@ -83,132 +84,5 @@ BOOST_FIXTURE_TEST_CASE(wallet_load_descriptors, TestingSetup)
}
}
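// Return whether the database contains any record whose key type matches the
// given 'key' string (e.g. DBKeys::CRYPTED_KEY).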
bool HasAnyRecordOfType(WalletDatabase& db, const std::string& key)
{
std::unique_ptr<DatabaseBatch> batch = db.MakeBatch(false);
BOOST_CHECK(batch);
std::unique_ptr<DatabaseCursor> cursor = batch->GetNewCursor();
BOOST_CHECK(cursor);
while (true) {
DataStream ssKey{};
DataStream ssValue{};
DatabaseCursor::Status status = cursor->Next(ssKey, ssValue);
assert(status != DatabaseCursor::Status::FAIL);
if (status == DatabaseCursor::Status::DONE) break;
std::string type;
ssKey >> type;
if (type == key) return true;
}
return false;
}
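// Serialize the given values into a flat byte blob, as used below for mockable
// database record keys and values.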
template<typename... Args>
SerializeData MakeSerializeData(const Args&... args)
{
DataStream s{};
SerializeMany(s, args...);
return {s.begin(), s.end()};
}
BOOST_FIXTURE_TEST_CASE(wallet_load_ckey, TestingSetup)
{
SerializeData ckey_record_key;
SerializeData ckey_record_value;
MockableData records;
{
// Context setup.
// Create and encrypt legacy wallet
std::shared_ptr<CWallet> wallet(new CWallet(m_node.chain.get(), "", CreateMockableWalletDatabase()));
LOCK(wallet->cs_wallet);
auto legacy_spkm = wallet->GetOrCreateLegacyScriptPubKeyMan();
BOOST_CHECK(legacy_spkm->SetupGeneration(true));
// Retrieve a key
CTxDestination dest = *Assert(legacy_spkm->GetNewDestination(OutputType::LEGACY));
CKeyID key_id = GetKeyForDestination(*legacy_spkm, dest);
CKey first_key;
BOOST_CHECK(legacy_spkm->GetKey(key_id, first_key));
// Encrypt the wallet
BOOST_CHECK(wallet->EncryptWallet("encrypt"));
wallet->Flush();
// Store a copy of all the records
records = GetMockableDatabase(*wallet).m_records;
// Get the record for the retrieved key
ckey_record_key = MakeSerializeData(DBKeys::CRYPTED_KEY, first_key.GetPubKey());
ckey_record_value = records.at(ckey_record_key);
}
{
// First test case:
// Erase all the crypted keys from db and unlock the wallet.
// The wallet only re-writes the crypted keys to the db if any checksum is missing at load time.
// So, if any 'ckey' record re-appears in the db, the checksums were not properly calculated and we are re-writing
// the records every time 'CWallet::Unlock' gets called, which is not good.
// Load the wallet and check that it is encrypted
std::shared_ptr<CWallet> wallet(new CWallet(m_node.chain.get(), "", CreateMockableWalletDatabase(records)));
BOOST_CHECK_EQUAL(wallet->LoadWallet(), DBErrors::LOAD_OK);
BOOST_CHECK(wallet->IsCrypted());
BOOST_CHECK(HasAnyRecordOfType(wallet->GetDatabase(), DBKeys::CRYPTED_KEY));
// Now delete all records and check that the 'Unlock' function doesn't re-write them
BOOST_CHECK(wallet->GetLegacyScriptPubKeyMan()->DeleteRecords());
BOOST_CHECK(!HasAnyRecordOfType(wallet->GetDatabase(), DBKeys::CRYPTED_KEY));
BOOST_CHECK(wallet->Unlock("encrypt"));
BOOST_CHECK(!HasAnyRecordOfType(wallet->GetDatabase(), DBKeys::CRYPTED_KEY));
}
{
// Second test case:
// Verify that loading up a 'ckey' with no checksum triggers a complete re-write of the crypted keys.
// Cut off the 32 byte checksum from a ckey record
records[ckey_record_key].resize(ckey_record_value.size() - 32);
// Load the wallet and check that it is encrypted
std::shared_ptr<CWallet> wallet(new CWallet(m_node.chain.get(), "", CreateMockableWalletDatabase(records)));
BOOST_CHECK_EQUAL(wallet->LoadWallet(), DBErrors::LOAD_OK);
BOOST_CHECK(wallet->IsCrypted());
BOOST_CHECK(HasAnyRecordOfType(wallet->GetDatabase(), DBKeys::CRYPTED_KEY));
// Now delete all ckey records and check that the 'Unlock' function re-writes them
// (this is because the wallet, at load time, found a ckey record with no checksum)
BOOST_CHECK(wallet->GetLegacyScriptPubKeyMan()->DeleteRecords());
BOOST_CHECK(!HasAnyRecordOfType(wallet->GetDatabase(), DBKeys::CRYPTED_KEY));
BOOST_CHECK(wallet->Unlock("encrypt"));
BOOST_CHECK(HasAnyRecordOfType(wallet->GetDatabase(), DBKeys::CRYPTED_KEY));
}
{
// Third test case:
// Verify that loading up a 'ckey' with an invalid checksum throws an error.
// Cut off the 32 byte checksum from a ckey record
records[ckey_record_key].resize(ckey_record_value.size() - 32);
// Fill in the checksum space with 0s
records[ckey_record_key].resize(ckey_record_value.size());
std::shared_ptr<CWallet> wallet(new CWallet(m_node.chain.get(), "", CreateMockableWalletDatabase(records)));
BOOST_CHECK_EQUAL(wallet->LoadWallet(), DBErrors::CORRUPT);
}
{
// Fourth test case:
// Verify that loading up a 'ckey' with an invalid pubkey throws an error
CPubKey invalid_key;
BOOST_CHECK(!invalid_key.IsValid());
SerializeData key = MakeSerializeData(DBKeys::CRYPTED_KEY, invalid_key);
records[key] = ckey_record_value;
std::shared_ptr<CWallet> wallet(new CWallet(m_node.chain.get(), "", CreateMockableWalletDatabase(records)));
BOOST_CHECK_EQUAL(wallet->LoadWallet(), DBErrors::CORRUPT);
}
}
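A minimal conceptual sketch of the checksum handling that the checksum-related cases above exercise, assuming a fixed-size ciphertext and a trailing 32-byte checksum; the names and sizes are illustrative only and do not come from walletdb.cpp:
#include <cstddef>
// Illustrative only: classify a 'ckey' record by whether a trailing checksum
// is present and valid. A missing checksum triggers a re-write on Unlock,
// while a present-but-wrong checksum is treated as corruption.
enum class CKeyLoadResult { OK, REWRITE_NEEDED, CORRUPT };
CKeyLoadResult ClassifyCKeyRecord(std::size_t value_size, std::size_t ciphertext_size, bool checksum_matches)
{
    constexpr std::size_t CHECKSUM_SIZE = 32;
    if (value_size == ciphertext_size) return CKeyLoadResult::REWRITE_NEEDED; // no checksum stored
    if (value_size != ciphertext_size + CHECKSUM_SIZE) return CKeyLoadResult::CORRUPT; // malformed record
    return checksum_matches ? CKeyLoadResult::OK : CKeyLoadResult::CORRUPT; // wrong checksum: corruption
}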
BOOST_AUTO_TEST_SUITE_END()
} // namespace wallet

View file

@ -385,11 +385,11 @@ std::shared_ptr<CWallet> CreateWallet(WalletContext& context, const std::string&
uint64_t wallet_creation_flags = options.create_flags;
const SecureString& passphrase = options.create_passphrase;
ArgsManager& args = *Assert(context.args);
if (wallet_creation_flags & WALLET_FLAG_DESCRIPTORS) options.require_format = DatabaseFormat::SQLITE;
else if (args.GetBoolArg("-swapbdbendian", false)) {
options.require_format = DatabaseFormat::BERKELEY_SWAP;
else {
error = Untranslated("Legacy wallets can no longer be created");
status = DatabaseStatus::FAILED_CREATE;
return nullptr;
}
// Indicate that the wallet is actually supposed to be blank and not just blank to make it encrypted
@ -3052,6 +3052,9 @@ std::shared_ptr<CWallet> CWallet::Create(WalletContext& context, const std::stri
error = strprintf(_("Unexpected legacy entry in descriptor wallet found. Loading wallet %s\n\n"
"The wallet might have been tampered with or created with malicious intent.\n"), walletFile);
return nullptr;
} else if (nLoadWalletRet == DBErrors::LEGACY_WALLET) {
error = strprintf(_("Error loading %s: Wallet is a legacy wallet. Please migrate to a descriptor wallet using the migration tool (migratewallet RPC)."), walletFile);
return nullptr;
} else {
error = strprintf(_("Error loading %s"), walletFile);
return nullptr;
@ -3069,10 +3072,8 @@ std::shared_ptr<CWallet> CWallet::Create(WalletContext& context, const std::stri
walletInstance->InitWalletFlags(wallet_creation_flags);
// Only create LegacyScriptPubKeyMan when not descriptor wallet
if (!walletInstance->IsWalletFlagSet(WALLET_FLAG_DESCRIPTORS)) {
walletInstance->SetupLegacyScriptPubKeyMan();
}
// Only descriptor wallets can be created
assert(walletInstance->IsWalletFlagSet(WALLET_FLAG_DESCRIPTORS));
if ((wallet_creation_flags & WALLET_FLAG_EXTERNAL_SIGNER) || !(wallet_creation_flags & (WALLET_FLAG_DISABLE_PRIVATE_KEYS | WALLET_FLAG_BLANK_WALLET))) {
LOCK(walletInstance->cs_wallet);
@ -3911,9 +3912,11 @@ bool CWallet::IsLegacy() const
DescriptorScriptPubKeyMan* CWallet::GetDescriptorScriptPubKeyMan(const WalletDescriptor& desc) const
{
for (auto& spk_man_pair : m_spk_managers) {
auto spk_man_pair = m_spk_managers.find(desc.id);
if (spk_man_pair != m_spk_managers.end()) {
// Try to downcast to DescriptorScriptPubKeyMan then check if the descriptors match
DescriptorScriptPubKeyMan* spk_manager = dynamic_cast<DescriptorScriptPubKeyMan*>(spk_man_pair.second.get());
DescriptorScriptPubKeyMan* spk_manager = dynamic_cast<DescriptorScriptPubKeyMan*>(spk_man_pair->second.get());
if (spk_manager != nullptr && spk_manager->HasWalletDescriptor(desc)) {
return spk_manager;
}
@ -3946,7 +3949,7 @@ std::optional<bool> CWallet::IsInternalScriptPubKeyMan(ScriptPubKeyMan* spk_man)
return GetScriptPubKeyMan(*type, /* internal= */ true) == desc_spk_man;
}
ScriptPubKeyMan* CWallet::AddWalletDescriptor(WalletDescriptor& desc, const FlatSigningProvider& signing_provider, const std::string& label, bool internal)
util::Result<ScriptPubKeyMan*> CWallet::AddWalletDescriptor(WalletDescriptor& desc, const FlatSigningProvider& signing_provider, const std::string& label, bool internal)
{
AssertLockHeld(cs_wallet);
@ -3958,7 +3961,9 @@ ScriptPubKeyMan* CWallet::AddWalletDescriptor(WalletDescriptor& desc, const Flat
auto spk_man = GetDescriptorScriptPubKeyMan(desc);
if (spk_man) {
WalletLogPrintf("Update existing descriptor: %s\n", desc.descriptor->ToString());
spk_man->UpdateWalletDescriptor(desc);
if (auto spkm_res = spk_man->UpdateWalletDescriptor(desc); !spkm_res) {
return util::Error{util::ErrorString(spkm_res)};
}
} else {
auto new_spk_man = std::unique_ptr<DescriptorScriptPubKeyMan>(new DescriptorScriptPubKeyMan(*this, desc, m_keypool_size));
spk_man = new_spk_man.get();
@ -4356,7 +4361,9 @@ bool DoMigration(CWallet& wallet, WalletContext& context, bilingual_str& error,
// Add to the wallet
WalletDescriptor w_desc(std::move(descs.at(0)), creation_time, 0, 0, 0);
data->watchonly_wallet->AddWalletDescriptor(w_desc, keys, "", false);
if (auto spkm_res = data->watchonly_wallet->AddWalletDescriptor(w_desc, keys, "", false); !spkm_res) {
throw std::runtime_error(util::ErrorString(spkm_res).original);
}
}
// Add the wallet to settings
@ -4393,7 +4400,9 @@ bool DoMigration(CWallet& wallet, WalletContext& context, bilingual_str& error,
// Add to the wallet
WalletDescriptor w_desc(std::move(descs.at(0)), creation_time, 0, 0, 0);
data->solvable_wallet->AddWalletDescriptor(w_desc, keys, "", false);
if (auto spkm_res = data->solvable_wallet->AddWalletDescriptor(w_desc, keys, "", false); !spkm_res) {
throw std::runtime_error(util::ErrorString(spkm_res).original);
}
}
// Add the wallet to settings

View file

@ -1039,7 +1039,7 @@ public:
std::optional<bool> IsInternalScriptPubKeyMan(ScriptPubKeyMan* spk_man) const;
//! Add a descriptor to the wallet, return a ScriptPubKeyMan & associated output type
ScriptPubKeyMan* AddWalletDescriptor(WalletDescriptor& desc, const FlatSigningProvider& signing_provider, const std::string& label, bool internal) EXCLUSIVE_LOCKS_REQUIRED(cs_wallet);
util::Result<ScriptPubKeyMan*> AddWalletDescriptor(WalletDescriptor& desc, const FlatSigningProvider& signing_provider, const std::string& label, bool internal) EXCLUSIVE_LOCKS_REQUIRED(cs_wallet);
/** Move all records from the BDB database to a new SQLite database for storage.
* The original BDB file will be deleted and replaced with a new SQLite file.

View file

@ -485,6 +485,11 @@ static DBErrors LoadWalletFlags(CWallet* pwallet, DatabaseBatch& batch) EXCLUSIV
pwallet->WalletLogPrintf("Error reading wallet database: Unknown non-tolerable wallet flags found\n");
return DBErrors::TOO_NEW;
}
// All wallets must be descriptor wallets unless opened with a bdb_ro db
// bdb_ro is only used for legacy to descriptor migration.
if (pwallet->GetDatabase().Format() != "bdb_ro" && !pwallet->IsWalletFlagSet(WALLET_FLAG_DESCRIPTORS)) {
return DBErrors::LEGACY_WALLET;
}
}
return DBErrors::LOAD_OK;
}
@ -1432,7 +1437,7 @@ std::unique_ptr<WalletDatabase> MakeDatabase(const fs::path& path, const Databas
std::optional<DatabaseFormat> format;
if (exists) {
if (IsBDBFile(BDBDataFile(path))) {
format = DatabaseFormat::BERKELEY;
format = DatabaseFormat::BERKELEY_RO;
}
if (IsSQLiteFile(SQLiteDataFile(path))) {
if (format) {
@ -1460,9 +1465,11 @@ std::unique_ptr<WalletDatabase> MakeDatabase(const fs::path& path, const Databas
return nullptr;
}
// If BERKELEY was the format, then change the format from BERKELEY to BERKELEY_RO
if (format && options.require_format && format == DatabaseFormat::BERKELEY && options.require_format == DatabaseFormat::BERKELEY_RO) {
format = DatabaseFormat::BERKELEY_RO;
// BERKELEY_RO can only be opened if require_format was set, which only occurs in migration.
if (format && format == DatabaseFormat::BERKELEY_RO && (!options.require_format || options.require_format != DatabaseFormat::BERKELEY_RO)) {
error = Untranslated(strprintf("Failed to open database path '%s'. The wallet appears to be a Legacy wallet, please use the wallet migration tool (migratewallet RPC).", fs::PathToString(path)));
status = DatabaseStatus::FAILED_BAD_FORMAT;
return nullptr;
}
// A db already exists so format is set, but options also specifies the format, so make sure they agree
@ -1475,12 +1482,8 @@ std::unique_ptr<WalletDatabase> MakeDatabase(const fs::path& path, const Databas
// Format is not set when a db doesn't already exist, so use the format specified by the options if it is set.
if (!format && options.require_format) format = options.require_format;
// If the format is not specified or detected, choose the default format based on what is available. We prefer BDB over SQLite for now.
if (!format) {
format = DatabaseFormat::SQLITE;
#ifdef USE_BDB
format = DatabaseFormat::BERKELEY;
#endif
}
if (format == DatabaseFormat::SQLITE) {
@ -1491,15 +1494,8 @@ std::unique_ptr<WalletDatabase> MakeDatabase(const fs::path& path, const Databas
return MakeBerkeleyRODatabase(path, options, status, error);
}
#ifdef USE_BDB
if constexpr (true) {
return MakeBerkeleyDatabase(path, options, status, error);
} else
#endif
{
error = Untranslated(strprintf("Failed to open database path '%s'. Build does not support Berkeley DB database format.", fs::PathToString(path)));
status = DatabaseStatus::FAILED_BAD_FORMAT;
return nullptr;
}
error = Untranslated(STR_INTERNAL_BUG("Could not determine wallet format"));
status = DatabaseStatus::FAILED_BAD_FORMAT;
return nullptr;
}
} // namespace wallet

View file

@ -55,7 +55,8 @@ enum class DBErrors : int
UNKNOWN_DESCRIPTOR = 6,
LOAD_FAIL = 7,
UNEXPECTED_LEGACY_ENTRY = 8,
CORRUPT = 9,
LEGACY_WALLET = 9,
CORRUPT = 10,
};
namespace DBKeys {

View file

@ -10,7 +10,6 @@
#include <util/fs.h>
#include <util/translation.h>
#include <wallet/dump.h>
#include <wallet/salvage.h>
#include <wallet/wallet.h>
#include <wallet/walletutil.h>
@ -112,21 +111,29 @@ static void WalletShowInfo(CWallet* wallet_instance)
bool ExecuteWalletToolFunc(const ArgsManager& args, const std::string& command)
{
if (args.IsArgSet("-format") && command != "createfromdump") {
tfm::format(std::cerr, "The -format option can only be used with the \"createfromdump\" command.\n");
return false;
}
if (args.IsArgSet("-dumpfile") && command != "dump" && command != "createfromdump") {
tfm::format(std::cerr, "The -dumpfile option can only be used with the \"dump\" and \"createfromdump\" commands.\n");
return false;
}
if (args.IsArgSet("-descriptors") && command != "create") {
tfm::format(std::cerr, "The -descriptors option can only be used with the 'create' command.\n");
return false;
if (args.IsArgSet("-descriptors")) {
if (command != "create") {
tfm::format(std::cerr, "The -descriptors option can only be used with the 'create' command.\n");
return false;
}
if (!args.GetBoolArg("-descriptors", true)) {
tfm::format(std::cerr, "The -descriptors option must be set to \"true\"\n");
return false;
}
}
if (args.IsArgSet("-legacy") && command != "create") {
tfm::format(std::cerr, "The -legacy option can only be used with the 'create' command.\n");
return false;
if (args.IsArgSet("-legacy")) {
if (command != "create") {
tfm::format(std::cerr, "The -legacy option can only be used with the 'create' command.\n");
return false;
}
if (args.GetBoolArg("-legacy", true)) {
tfm::format(std::cerr, "The -legacy option must be set to \"false\"\n");
return false;
}
}
if (command == "create" && !args.IsArgSet("-wallet")) {
tfm::format(std::cerr, "Wallet name must be provided when creating a new wallet.\n");
@ -139,22 +146,8 @@ bool ExecuteWalletToolFunc(const ArgsManager& args, const std::string& command)
DatabaseOptions options;
ReadDatabaseArgs(args, options);
options.require_create = true;
// If -legacy is set, use it. Otherwise default to false.
bool make_legacy = args.GetBoolArg("-legacy", false);
// If neither -legacy nor -descriptors is set, default to true. If -descriptors is set, use its value.
bool make_descriptors = (!args.IsArgSet("-descriptors") && !args.IsArgSet("-legacy")) || (args.IsArgSet("-descriptors") && args.GetBoolArg("-descriptors", true));
if (make_legacy && make_descriptors) {
tfm::format(std::cerr, "Only one of -legacy or -descriptors can be set to true, not both\n");
return false;
}
if (!make_legacy && !make_descriptors) {
tfm::format(std::cerr, "One of -legacy or -descriptors must be set to true (or omitted)\n");
return false;
}
if (make_descriptors) {
options.create_flags |= WALLET_FLAG_DESCRIPTORS;
options.require_format = DatabaseFormat::SQLITE;
}
options.create_flags |= WALLET_FLAG_DESCRIPTORS;
options.require_format = DatabaseFormat::SQLITE;
const std::shared_ptr<CWallet> wallet_instance = MakeWallet(name, path, options);
if (wallet_instance) {
@ -169,24 +162,6 @@ bool ExecuteWalletToolFunc(const ArgsManager& args, const std::string& command)
if (!wallet_instance) return false;
WalletShowInfo(wallet_instance.get());
wallet_instance->Close();
} else if (command == "salvage") {
#ifdef USE_BDB
bilingual_str error;
std::vector<bilingual_str> warnings;
bool ret = RecoverDatabaseFile(args, path, error, warnings);
if (!ret) {
for (const auto& warning : warnings) {
tfm::format(std::cerr, "%s\n", warning.original);
}
if (!error.empty()) {
tfm::format(std::cerr, "%s\n", error.original);
}
}
return ret;
#else
tfm::format(std::cerr, "Salvage command is not available as BDB support is not compiled");
return false;
#endif
} else if (command == "dump") {
DatabaseOptions options;
ReadDatabaseArgs(args, options);

View file

@ -58,7 +58,7 @@ std::unique_ptr<CZMQNotificationInterface> CZMQNotificationInterface::Create(std
const auto& factory = entry.second;
for (std::string& address : gArgs.GetArgs(arg)) {
// libzmq uses prefix "ipc://" for UNIX domain sockets
if (address.substr(0, ADDR_PREFIX_UNIX.length()) == ADDR_PREFIX_UNIX) {
if (address.starts_with(ADDR_PREFIX_UNIX)) {
address.replace(0, ADDR_PREFIX_UNIX.length(), ADDR_PREFIX_IPC);
}
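As an aside, the substr()-to-starts_with change above is a pure idiom cleanup; a self-contained illustration follows (the prefix value is a placeholder, not necessarily the real ADDR_PREFIX_UNIX):
#include <string>
// Requires C++20. The old form sliced the string manually; starts_with states
// the intent directly and cannot mis-slice.
bool HasUnixPrefix(const std::string& address)
{
    static const std::string ADDR_PREFIX_UNIX{"unix:"}; // placeholder value
    // before: address.substr(0, ADDR_PREFIX_UNIX.length()) == ADDR_PREFIX_UNIX
    return address.starts_with(ADDR_PREFIX_UNIX);
}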

View file

@ -79,9 +79,6 @@ class ExampleTest(BitcoinTestFramework):
# Override the set_test_params(), skip_test_if_missing_module(), add_options(), setup_chain(), setup_network()
# and setup_nodes() methods to customize the test setup as required.
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
"""Override test parameters for your individual test.

View file

@ -45,9 +45,6 @@ SEQUENCE_LOCKTIME_MASK = 0x0000ffff
NOT_FINAL_ERROR = "non-BIP68-final"
class BIP68Test(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 2
self.extra_args = [
@ -58,6 +55,7 @@ class BIP68Test(BitcoinTestFramework):
'-testactivationheight=csv@432',
],
]
self.uses_wallet = None
def run_test(self):
self.relayfee = self.nodes[0].getnetworkinfo()["relayfee"]

View file

@ -18,9 +18,6 @@ from test_framework import util
class ConfArgsTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 1
@ -30,6 +27,7 @@ class ConfArgsTest(BitcoinTestFramework):
self.supports_cli = False
self.wallet_names = []
self.disable_autoconnect = False
self.uses_wallet = None
# Overridden to avoid attempt to sync not yet started nodes.
def setup_network(self):

View file

@ -13,12 +13,10 @@ from test_framework.test_node import (
)
class FilelockTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 2
self.uses_wallet = None
def setup_network(self):
self.add_nodes(self.num_nodes, extra_args=None)
@ -46,20 +44,12 @@ class FilelockTest(BitcoinTestFramework):
assert pid_file.exists()
if self.is_wallet_compiled():
def check_wallet_filelock(descriptors):
wallet_name = ''.join([random.choice(string.ascii_lowercase) for _ in range(6)])
self.nodes[0].createwallet(wallet_name=wallet_name, descriptors=descriptors)
wallet_dir = self.nodes[0].wallets_path
self.log.info("Check that we can't start a second bitcoind instance using the same wallet")
if descriptors:
expected_msg = f"Error: SQLiteDatabase: Unable to obtain an exclusive lock on the database, is it being used by another instance of {self.config['environment']['CLIENT_NAME']}?"
else:
expected_msg = "Error: Error initializing wallet database environment"
self.nodes[1].assert_start_raises_init_error(extra_args=[f'-walletdir={wallet_dir}', f'-wallet={wallet_name}', '-noserver'], expected_msg=expected_msg, match=ErrorMatch.PARTIAL_REGEX)
if self.is_bdb_compiled():
check_wallet_filelock(False)
check_wallet_filelock(True)
wallet_name = ''.join([random.choice(string.ascii_lowercase) for _ in range(6)])
self.nodes[0].createwallet(wallet_name=wallet_name)
wallet_dir = self.nodes[0].wallets_path
self.log.info("Check that we can't start a second bitcoind instance using the same wallet")
expected_msg = f"Error: SQLiteDatabase: Unable to obtain an exclusive lock on the database, is it being used by another instance of {self.config['environment']['CLIENT_NAME']}?"
self.nodes[1].assert_start_raises_init_error(extra_args=[f'-walletdir={wallet_dir}', f'-wallet={wallet_name}', '-noserver'], expected_msg=expected_msg, match=ErrorMatch.PARTIAL_REGEX)
if __name__ == '__main__':
FilelockTest(__file__).main()

View file

@ -24,12 +24,10 @@ class InitTest(BitcoinTestFramework):
subsequent starts.
"""
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = False
self.num_nodes = 1
self.uses_wallet = None
def init_stress_test(self):
"""

View file

@ -25,12 +25,10 @@ def notify_outputname(walletname, txid):
class NotificationsTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 2
self.setup_clean_chain = True
self.uses_wallet = None
def setup_network(self):
self.wallet = ''.join(chr(i) for i in range(FILE_CHAR_START, FILE_CHAR_END) if chr(i) not in FILE_CHARS_DISALLOWED)
@ -58,7 +56,6 @@ class NotificationsTest(BitcoinTestFramework):
def run_test(self):
if self.is_wallet_compiled():
# Setup the descriptors to be imported to the wallet
seed = "cTdGmKFWpbvpKQ7ejrdzqYT2hhjyb3GPHnLAK7wdi5Em67YLwSm9"
xpriv = "tprv8ZgxMBicQKsPfHCsTwkiM1KT56RXbGGTqvc2hgqzycpwbHqqpcajQeMRZoBD35kW4RtyCemu6j34Ku5DEspmgjKdt2qe4SvRch5Kk8B8A2v"
desc_imports = [{
"desc": descsum_create(f"wpkh({xpriv}/0/*)"),
@ -75,11 +72,8 @@ class NotificationsTest(BitcoinTestFramework):
# Make the wallets and import the descriptors
# Ensures that node 0 and node 1 share the same wallet for the conflicting transaction tests below.
for i, name in enumerate(self.wallet_names):
self.nodes[i].createwallet(wallet_name=name, descriptors=self.options.descriptors, blank=True, load_on_startup=True)
if self.options.descriptors:
self.nodes[i].importdescriptors(desc_imports)
else:
self.nodes[i].sethdseed(True, seed)
self.nodes[i].createwallet(wallet_name=name, blank=True, load_on_startup=True)
self.nodes[i].importdescriptors(desc_imports)
self.log.info("test -blocknotify")
block_count = 10

View file

@ -66,13 +66,11 @@ def calc_usage(blockdir):
return sum(os.path.getsize(blockdir + f) for f in os.listdir(blockdir) if os.path.isfile(os.path.join(blockdir, f))) / (1024. * 1024.)
class PruneTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 6
self.supports_cli = False
self.uses_wallet = None
# Create nodes 0 and 1 to mine.
# Create node 2 to test pruning.

View file

@ -20,9 +20,6 @@ from test_framework.address import ADDRESS_BCRT1_UNSPENDABLE
MAX_REPLACEMENT_LIMIT = 100
class ReplaceByFeeTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 2
self.extra_args = [
@ -37,6 +34,7 @@ class ReplaceByFeeTest(BitcoinTestFramework):
],
]
self.supports_cli = False
self.uses_wallet = None
def run_test(self):
self.wallet = MiniWallet(self.nodes[0])
@ -567,7 +565,7 @@ class ReplaceByFeeTest(BitcoinTestFramework):
assert_equal(json0["vin"][0]["sequence"], 4294967293)
assert_equal(json1["vin"][0]["sequence"], 4294967295)
if self.is_specified_wallet_compiled():
if self.is_wallet_compiled():
self.init_wallet(node=0)
rawtx2 = self.nodes[0].createrawtransaction([], outs)
frawtx2a = self.nodes[0].fundrawtransaction(rawtx2, {"replaceable": True})

View file

@ -7,9 +7,6 @@
from decimal import Decimal
from test_framework.address import (
key_to_p2pkh,
program_to_witness,
script_to_p2sh,
script_to_p2sh_p2wsh,
script_to_p2wsh,
)
@ -28,14 +25,11 @@ from test_framework.messages import (
)
from test_framework.script import (
CScript,
OP_0,
OP_1,
OP_DROP,
OP_TRUE,
)
from test_framework.script_util import (
key_to_p2pk_script,
key_to_p2pkh_script,
key_to_p2wpkh_script,
keys_to_multisig_script,
script_to_p2sh_script,
@ -47,7 +41,6 @@ from test_framework.util import (
assert_greater_than_or_equal,
assert_is_hex_string,
assert_raises_rpc_error,
try_rpc,
)
from test_framework.wallet_util import (
get_generate_key,
@ -78,9 +71,6 @@ txs_mined = {} # txindex from txid to blockhash
class SegWitTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 3
@ -159,18 +149,12 @@ class SegWitTest(BitcoinTestFramework):
assert_equal(self.nodes[i].deriveaddresses(sh_wpkh_desc)[0], key.p2sh_p2wpkh_addr)
assert_equal(self.nodes[i].deriveaddresses(wpkh_desc)[0], key.p2wpkh_addr)
if self.options.descriptors:
res = self.nodes[i].importdescriptors([
res = self.nodes[i].importdescriptors([
{"desc": p2sh_ms_desc, "timestamp": "now"},
{"desc": bip173_ms_desc, "timestamp": "now"},
{"desc": sh_wpkh_desc, "timestamp": "now"},
{"desc": wpkh_desc, "timestamp": "now"},
])
else:
# The nature of the legacy wallet is that this import results in also adding all of the necessary scripts
res = self.nodes[i].importmulti([
{"desc": p2sh_ms_desc, "timestamp": "now"},
])
assert all([r["success"] for r in res])
p2sh_ids.append([])
@ -315,286 +299,6 @@ class SegWitTest(BitcoinTestFramework):
# Mine a block to clear the gbt cache again.
self.generate(self.nodes[0], 1)
if not self.options.descriptors:
self.log.info("Verify behaviour of importaddress and listunspent")
# Some public keys to be used later
pubkeys = [
"0363D44AABD0F1699138239DF2F042C3282C0671CC7A76826A55C8203D90E39242", # cPiM8Ub4heR9NBYmgVzJQiUH1if44GSBGiqaeJySuL2BKxubvgwb
"02D3E626B3E616FC8662B489C123349FECBFC611E778E5BE739B257EAE4721E5BF", # cPpAdHaD6VoYbW78kveN2bsvb45Q7G5PhaPApVUGwvF8VQ9brD97
"04A47F2CBCEFFA7B9BCDA184E7D5668D3DA6F9079AD41E422FA5FD7B2D458F2538A62F5BD8EC85C2477F39650BD391EA6250207065B2A81DA8B009FC891E898F0E", # 91zqCU5B9sdWxzMt1ca3VzbtVm2YM6Hi5Rxn4UDtxEaN9C9nzXV
"02A47F2CBCEFFA7B9BCDA184E7D5668D3DA6F9079AD41E422FA5FD7B2D458F2538", # cPQFjcVRpAUBG8BA9hzr2yEzHwKoMgLkJZBBtK9vJnvGJgMjzTbd
"036722F784214129FEB9E8129D626324F3F6716555B603FFE8300BBCB882151228", # cQGtcm34xiLjB1v7bkRa4V3aAc9tS2UTuBZ1UnZGeSeNy627fN66
"0266A8396EE936BF6D99D17920DB21C6C7B1AB14C639D5CD72B300297E416FD2EC", # cTW5mR5M45vHxXkeChZdtSPozrFwFgmEvTNnanCW6wrqwaCZ1X7K
"0450A38BD7F0AC212FEBA77354A9B036A32E0F7C81FC4E0C5ADCA7C549C4505D2522458C2D9AE3CEFD684E039194B72C8A10F9CB9D4764AB26FCC2718D421D3B84", # 92h2XPssjBpsJN5CqSP7v9a7cf2kgDunBC6PDFwJHMACM1rrVBJ
]
# Import a compressed key and an uncompressed key, generate some multisig addresses
self.nodes[0].importprivkey("92e6XLo5jVAVwrQKPNTs93oQco8f8sDNBcpv73Dsrs397fQtFQn")
uncompressed_spendable_address = ["mvozP4UwyGD2mGZU4D2eMvMLPB9WkMmMQu"]
self.nodes[0].importprivkey("cNC8eQ5dg3mFAVePDX4ddmPYpPbw41r9bm2jd1nLJT77e6RrzTRR")
compressed_spendable_address = ["mmWQubrDomqpgSYekvsU7HWEVjLFHAakLe"]
assert not self.nodes[0].getaddressinfo(uncompressed_spendable_address[0])['iscompressed']
assert self.nodes[0].getaddressinfo(compressed_spendable_address[0])['iscompressed']
self.nodes[0].importpubkey(pubkeys[0])
compressed_solvable_address = [key_to_p2pkh(pubkeys[0])]
self.nodes[0].importpubkey(pubkeys[1])
compressed_solvable_address.append(key_to_p2pkh(pubkeys[1]))
self.nodes[0].importpubkey(pubkeys[2])
uncompressed_solvable_address = [key_to_p2pkh(pubkeys[2])]
spendable_anytime = [] # These outputs should be seen anytime after importprivkey and addmultisigaddress
spendable_after_importaddress = [] # These outputs should be seen after importaddress
solvable_after_importaddress = [] # These outputs should be seen after importaddress but not spendable
unsolvable_after_importaddress = [] # These outputs should be unsolvable after importaddress
solvable_anytime = [] # These outputs should be solvable after importpubkey
unseen_anytime = [] # These outputs should never be seen
uncompressed_spendable_address.append(self.nodes[0].addmultisigaddress(2, [uncompressed_spendable_address[0], compressed_spendable_address[0]])['address'])
uncompressed_spendable_address.append(self.nodes[0].addmultisigaddress(2, [uncompressed_spendable_address[0], uncompressed_spendable_address[0]])['address'])
compressed_spendable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_spendable_address[0], compressed_spendable_address[0]])['address'])
uncompressed_solvable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_spendable_address[0], uncompressed_solvable_address[0]])['address'])
compressed_solvable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_spendable_address[0], compressed_solvable_address[0]])['address'])
compressed_solvable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_solvable_address[0], compressed_solvable_address[1]])['address'])
# Test multisig_without_privkey
# We have 2 public keys without private keys, use addmultisigaddress to add to wallet.
# Money sent to P2SH of multisig of this should only be seen after importaddress with the BASE58 P2SH address.
multisig_without_privkey_address = self.nodes[0].addmultisigaddress(2, [pubkeys[3], pubkeys[4]])['address']
script = keys_to_multisig_script([pubkeys[3], pubkeys[4]])
solvable_after_importaddress.append(script_to_p2sh_script(script))
for i in compressed_spendable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
# p2sh multisig with compressed keys should always be spendable
spendable_anytime.extend([p2sh])
# bare multisig can be watched and signed, but is not treated as ours
solvable_after_importaddress.extend([bare])
# P2WSH and P2SH(P2WSH) multisig with compressed keys are spendable after direct importaddress
spendable_after_importaddress.extend([p2wsh, p2sh_p2wsh])
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# normal P2PKH and P2PK with compressed keys should always be spendable
spendable_anytime.extend([p2pkh, p2pk])
# P2SH_P2PK, P2SH_P2PKH with compressed keys are spendable after direct importaddress
spendable_after_importaddress.extend([p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh])
# P2WPKH and P2SH_P2WPKH with compressed keys should always be spendable
spendable_anytime.extend([p2wpkh, p2sh_p2wpkh])
for i in uncompressed_spendable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
# p2sh multisig with uncompressed keys should always be spendable
spendable_anytime.extend([p2sh])
# bare multisig can be watched and signed, but is not treated as ours
solvable_after_importaddress.extend([bare])
# P2WSH and P2SH(P2WSH) multisig with uncompressed keys are never seen
unseen_anytime.extend([p2wsh, p2sh_p2wsh])
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# normal P2PKH and P2PK with uncompressed keys should always be spendable
spendable_anytime.extend([p2pkh, p2pk])
# P2SH_P2PK and P2SH_P2PKH are spendable after direct importaddress
spendable_after_importaddress.extend([p2sh_p2pk, p2sh_p2pkh])
# Witness output types with uncompressed keys are never seen
unseen_anytime.extend([p2wpkh, p2sh_p2wpkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh])
for i in compressed_solvable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
# Multisig without private is not seen after addmultisigaddress, but seen after importaddress
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
solvable_after_importaddress.extend([bare, p2sh, p2wsh, p2sh_p2wsh])
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# normal P2PKH, P2PK, P2WPKH and P2SH_P2WPKH with compressed keys should always be seen
solvable_anytime.extend([p2pkh, p2pk, p2wpkh, p2sh_p2wpkh])
# P2SH_P2PK, P2SH_P2PKH with compressed keys are seen after direct importaddress
solvable_after_importaddress.extend([p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh])
for i in uncompressed_solvable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
# Base uncompressed multisig without private is not seen after addmultisigaddress, but seen after importaddress
solvable_after_importaddress.extend([bare, p2sh])
# P2WSH and P2SH(P2WSH) multisig with uncompressed keys are never seen
unseen_anytime.extend([p2wsh, p2sh_p2wsh])
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# normal P2PKH and P2PK with uncompressed keys should always be seen
solvable_anytime.extend([p2pkh, p2pk])
# P2SH_P2PK, P2SH_P2PKH with uncompressed keys are seen after direct importaddress
solvable_after_importaddress.extend([p2sh_p2pk, p2sh_p2pkh])
# Witness output types with uncompressed keys are never seen
unseen_anytime.extend([p2wpkh, p2sh_p2wpkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh])
op1 = CScript([OP_1])
op0 = CScript([OP_0])
# 2N7MGY19ti4KDMSzRfPAssP6Pxyuxoi6jLe is the P2SH(P2PKH) version of mjoE3sSrb8ByYEvgnC3Aox86u1CHnfJA4V
unsolvable_address_key = bytes.fromhex("02341AEC7587A51CDE5279E0630A531AEA2615A9F80B17E8D9376327BAEAA59E3D")
unsolvablep2pkh = key_to_p2pkh_script(unsolvable_address_key)
unsolvablep2wshp2pkh = script_to_p2wsh_script(unsolvablep2pkh)
p2shop0 = script_to_p2sh_script(op0)
p2wshop1 = script_to_p2wsh_script(op1)
unsolvable_after_importaddress.append(unsolvablep2pkh)
unsolvable_after_importaddress.append(unsolvablep2wshp2pkh)
unsolvable_after_importaddress.append(op1) # OP_1 will be imported as script
unsolvable_after_importaddress.append(p2wshop1)
unseen_anytime.append(op0) # OP_0 will be imported as P2SH address with no script provided
unsolvable_after_importaddress.append(p2shop0)
spendable_txid = []
solvable_txid = []
spendable_txid.append(self.mine_and_test_listunspent(spendable_anytime, 2))
solvable_txid.append(self.mine_and_test_listunspent(solvable_anytime, 1))
self.mine_and_test_listunspent(spendable_after_importaddress + solvable_after_importaddress + unseen_anytime + unsolvable_after_importaddress, 0)
importlist = []
for i in compressed_spendable_address + uncompressed_spendable_address + compressed_solvable_address + uncompressed_solvable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
bare = bytes.fromhex(v['hex'])
importlist.append(bare.hex())
importlist.append(script_to_p2wsh_script(bare).hex())
else:
pubkey = bytes.fromhex(v['pubkey'])
p2pk = key_to_p2pk_script(pubkey)
p2pkh = key_to_p2pkh_script(pubkey)
importlist.append(p2pk.hex())
importlist.append(p2pkh.hex())
importlist.append(key_to_p2wpkh_script(pubkey).hex())
importlist.append(script_to_p2wsh_script(p2pk).hex())
importlist.append(script_to_p2wsh_script(p2pkh).hex())
importlist.append(unsolvablep2pkh.hex())
importlist.append(unsolvablep2wshp2pkh.hex())
importlist.append(op1.hex())
importlist.append(p2wshop1.hex())
for i in importlist:
# import all generated addresses. The wallet already has the private keys for some of these, so catch JSON RPC
# exceptions and continue.
try_rpc(-4, "The wallet already contains the private key for this address or script", self.nodes[0].importaddress, i, "", False, True)
self.nodes[0].importaddress(script_to_p2sh(op0)) # import OP_0 as address only
self.nodes[0].importaddress(multisig_without_privkey_address) # Test multisig_without_privkey
spendable_txid.append(self.mine_and_test_listunspent(spendable_anytime + spendable_after_importaddress, 2))
solvable_txid.append(self.mine_and_test_listunspent(solvable_anytime + solvable_after_importaddress, 1))
self.mine_and_test_listunspent(unsolvable_after_importaddress, 1)
self.mine_and_test_listunspent(unseen_anytime, 0)
spendable_txid.append(self.mine_and_test_listunspent(spendable_anytime + spendable_after_importaddress, 2))
solvable_txid.append(self.mine_and_test_listunspent(solvable_anytime + solvable_after_importaddress, 1))
self.mine_and_test_listunspent(unsolvable_after_importaddress, 1)
self.mine_and_test_listunspent(unseen_anytime, 0)
# Repeat some tests. This time we don't add witness scripts with importaddress
# Import a compressed key and an uncompressed key, generate some multisig addresses
self.nodes[0].importprivkey("927pw6RW8ZekycnXqBQ2JS5nPyo1yRfGNN8oq74HeddWSpafDJH")
uncompressed_spendable_address = ["mguN2vNSCEUh6rJaXoAVwY3YZwZvEmf5xi"]
self.nodes[0].importprivkey("cMcrXaaUC48ZKpcyydfFo8PxHAjpsYLhdsp6nmtB3E2ER9UUHWnw")
compressed_spendable_address = ["n1UNmpmbVUJ9ytXYXiurmGPQ3TRrXqPWKL"]
self.nodes[0].importpubkey(pubkeys[5])
compressed_solvable_address = [key_to_p2pkh(pubkeys[5])]
self.nodes[0].importpubkey(pubkeys[6])
uncompressed_solvable_address = [key_to_p2pkh(pubkeys[6])]
unseen_anytime = [] # These outputs should never be seen
solvable_anytime = [] # These outputs should be solvable after importpubkey
unseen_anytime = [] # These outputs should never be seen
uncompressed_spendable_address.append(self.nodes[0].addmultisigaddress(2, [uncompressed_spendable_address[0], compressed_spendable_address[0]])['address'])
uncompressed_spendable_address.append(self.nodes[0].addmultisigaddress(2, [uncompressed_spendable_address[0], uncompressed_spendable_address[0]])['address'])
compressed_spendable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_spendable_address[0], compressed_spendable_address[0]])['address'])
uncompressed_solvable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_solvable_address[0], uncompressed_solvable_address[0]])['address'])
compressed_solvable_address.append(self.nodes[0].addmultisigaddress(2, [compressed_spendable_address[0], compressed_solvable_address[0]])['address'])
premature_witaddress = []
for i in compressed_spendable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
premature_witaddress.append(script_to_p2sh(p2wsh))
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# P2WPKH, P2SH_P2WPKH are always spendable
spendable_anytime.extend([p2wpkh, p2sh_p2wpkh])
for i in uncompressed_spendable_address + uncompressed_solvable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
# P2WSH and P2SH(P2WSH) multisig with uncompressed keys are never seen
unseen_anytime.extend([p2wsh, p2sh_p2wsh])
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# P2WPKH, P2SH_P2WPKH with uncompressed keys are never seen
unseen_anytime.extend([p2wpkh, p2sh_p2wpkh])
for i in compressed_solvable_address:
v = self.nodes[0].getaddressinfo(i)
if v['isscript']:
[bare, p2sh, p2wsh, p2sh_p2wsh] = self.p2sh_address_to_script(v)
premature_witaddress.append(script_to_p2sh(p2wsh))
else:
[p2wpkh, p2sh_p2wpkh, p2pk, p2pkh, p2sh_p2pk, p2sh_p2pkh, p2wsh_p2pk, p2wsh_p2pkh, p2sh_p2wsh_p2pk, p2sh_p2wsh_p2pkh] = self.p2pkh_address_to_script(v)
# P2SH_P2PK, P2SH_P2PKH with compressed keys are always solvable
solvable_anytime.extend([p2wpkh, p2sh_p2wpkh])
self.mine_and_test_listunspent(spendable_anytime, 2)
self.mine_and_test_listunspent(solvable_anytime, 1)
self.mine_and_test_listunspent(unseen_anytime, 0)
# Check that createrawtransaction/decoderawtransaction with non-v0 Bech32 works
v1_addr = program_to_witness(1, [3, 5])
v1_tx = self.nodes[0].createrawtransaction([getutxo(spendable_txid[0])], {v1_addr: 1})
v1_decoded = self.nodes[1].decoderawtransaction(v1_tx)
assert_equal(v1_decoded['vout'][0]['scriptPubKey']['address'], v1_addr)
assert_equal(v1_decoded['vout'][0]['scriptPubKey']['hex'], "51020305")
# Check that spendable outputs are really spendable
self.create_and_mine_tx_from_txids(spendable_txid)
# import all the private keys so solvable addresses become spendable
self.nodes[0].importprivkey("cPiM8Ub4heR9NBYmgVzJQiUH1if44GSBGiqaeJySuL2BKxubvgwb")
self.nodes[0].importprivkey("cPpAdHaD6VoYbW78kveN2bsvb45Q7G5PhaPApVUGwvF8VQ9brD97")
self.nodes[0].importprivkey("91zqCU5B9sdWxzMt1ca3VzbtVm2YM6Hi5Rxn4UDtxEaN9C9nzXV")
self.nodes[0].importprivkey("cPQFjcVRpAUBG8BA9hzr2yEzHwKoMgLkJZBBtK9vJnvGJgMjzTbd")
self.nodes[0].importprivkey("cQGtcm34xiLjB1v7bkRa4V3aAc9tS2UTuBZ1UnZGeSeNy627fN66")
self.nodes[0].importprivkey("cTW5mR5M45vHxXkeChZdtSPozrFwFgmEvTNnanCW6wrqwaCZ1X7K")
self.create_and_mine_tx_from_txids(solvable_txid)
# Test that importing native P2WPKH/P2WSH scripts works
for use_p2wsh in [False, True]:
if use_p2wsh:
scriptPubKey = "00203a59f3f56b713fdcf5d1a57357f02c44342cbf306ffe0c4741046837bf90561a"
transaction = "01000000000100e1f505000000002200203a59f3f56b713fdcf5d1a57357f02c44342cbf306ffe0c4741046837bf90561a00000000"
else:
scriptPubKey = "a9142f8c469c2f0084c48e11f998ffbe7efa7549f26d87"
transaction = "01000000000100e1f5050000000017a9142f8c469c2f0084c48e11f998ffbe7efa7549f26d8700000000"
self.nodes[1].importaddress(scriptPubKey, "", False)
rawtxfund = self.nodes[1].fundrawtransaction(transaction)['hex']
rawtxfund = self.nodes[1].signrawtransactionwithwallet(rawtxfund)["hex"]
txid = self.nodes[1].sendrawtransaction(rawtxfund)
assert_equal(self.nodes[1].gettransaction(txid, True)["txid"], txid)
assert_equal(self.nodes[1].listtransactions("*", 1, 0, True)[0]["txid"], txid)
# Assert it is properly saved
self.restart_node(1)
assert_equal(self.nodes[1].gettransaction(txid, True)["txid"], txid)
assert_equal(self.nodes[1].listtransactions("*", 1, 0, True)[0]["txid"], txid)
def mine_and_test_listunspent(self, script_list, ismine):
utxo = find_spendable_utxo(self.nodes[0], 50)
tx = CTransaction()

View file

@ -13,13 +13,11 @@ from test_framework.util import assert_equal
class SettingsTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 1
self.wallet_names = []
self.uses_wallet = None
def test_wallet_settings(self, settings_path):
if not self.is_wallet_compiled():

View file

@ -1311,7 +1311,6 @@ UTXOData = namedtuple('UTXOData', 'outpoint,output,spender')
class TaprootTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
parser.add_argument("--dumptests", dest="dump_tests", default=False, action="store_true",
help="Dump generated test cases to directory set by TEST_DUMP_DIR environment variable")

View file

@ -73,12 +73,10 @@ def cli_get_info_string_to_dict(cli_get_info_string):
class TestBitcoinCli(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 1
self.uses_wallet = None
def skip_test_if_missing_module(self):
self.skip_if_no_cli()
@ -180,7 +178,7 @@ class TestBitcoinCli(BitcoinTestFramework):
assert_raises_process_error(1, "Invalid value for -color option. Valid values: always, auto, never.", self.nodes[0].cli('-getinfo', '-color=foo').send_cli)
self.log.info("Test -getinfo returns expected network and blockchain info")
if self.is_specified_wallet_compiled():
if self.is_wallet_compiled():
self.import_deterministic_coinbase_privkeys()
self.nodes[0].encryptwallet(password)
cli_get_info_string = self.nodes[0].cli('-getinfo').send_cli()
@ -206,7 +204,7 @@ class TestBitcoinCli(BitcoinTestFramework):
cli_get_info = cli_get_info_string_to_dict(cli_get_info_string)
assert_equal(cli_get_info["Proxies"], "127.0.0.1:9050 (ipv4, ipv6, onion, cjdns), 127.0.0.1:7656 (i2p)")
if self.is_specified_wallet_compiled():
if self.is_wallet_compiled():
self.log.info("Test -getinfo and bitcoin-cli getwalletinfo return expected wallet info")
# Explicitly set the output type in order to have consistent tx vsize / fees
# for both legacy and descriptor wallets (disables the change address type detection algorithm)

View file

@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright (c) 2022 The Bitcoin Core developers
# Copyright (c) 2022-present The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
@ -17,6 +17,7 @@ from test_framework.util import (
assert_equal,
assert_greater_than,
assert_raises_rpc_error,
bpf_cflags,
)
coinselection_tracepoints_program = """
@ -106,9 +107,6 @@ int trace_aps_create_tx(struct pt_regs *ctx) {
class CoinSelectionTracepointTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 1
self.setup_clean_chain = True
@ -175,7 +173,7 @@ class CoinSelectionTracepointTest(BitcoinTestFramework):
ctx.enable_probe(probe="coin_selection:normal_create_tx_internal", fn_name="trace_normal_create_tx")
ctx.enable_probe(probe="coin_selection:attempting_aps_create_tx", fn_name="trace_attempt_aps")
ctx.enable_probe(probe="coin_selection:aps_create_tx_internal", fn_name="trace_aps_create_tx")
self.bpf = BPF(text=coinselection_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
self.bpf = BPF(text=coinselection_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
self.log.info("Prepare wallets")
self.generate(self.nodes[0], 101)

View file

@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright (c) 2022 The Bitcoin Core developers
# Copyright (c) 2022-present The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
@ -20,7 +20,10 @@ from test_framework.blocktools import COINBASE_MATURITY
from test_framework.messages import COIN, DEFAULT_MEMPOOL_EXPIRY_HOURS
from test_framework.p2p import P2PDataStore
from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import assert_equal
from test_framework.util import (
assert_equal,
bpf_cflags,
)
from test_framework.wallet import MiniWallet
MEMPOOL_TRACEPOINTS_PROGRAM = """
@ -166,7 +169,7 @@ class MempoolTracepointTest(BitcoinTestFramework):
node = self.nodes[0]
ctx = USDT(pid=node.process.pid)
ctx.enable_probe(probe="mempool:added", fn_name="trace_added")
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
def handle_added_event(_, data, __):
events.append(bpf["added_events"].event(data))
@ -203,7 +206,7 @@ class MempoolTracepointTest(BitcoinTestFramework):
node = self.nodes[0]
ctx = USDT(pid=node.process.pid)
ctx.enable_probe(probe="mempool:removed", fn_name="trace_removed")
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
def handle_removed_event(_, data, __):
events.append(bpf["removed_events"].event(data))
@ -249,7 +252,7 @@ class MempoolTracepointTest(BitcoinTestFramework):
node = self.nodes[0]
ctx = USDT(pid=node.process.pid)
ctx.enable_probe(probe="mempool:replaced", fn_name="trace_replaced")
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
def handle_replaced_event(_, data, __):
event = ctypes.cast(data, ctypes.POINTER(MempoolReplaced)).contents
@ -302,7 +305,7 @@ class MempoolTracepointTest(BitcoinTestFramework):
self.log.info("Hooking into mempool:rejected tracepoint...")
ctx = USDT(pid=node.process.pid)
ctx.enable_probe(probe="mempool:rejected", fn_name="trace_rejected")
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=MEMPOOL_TRACEPOINTS_PROGRAM, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
def handle_rejected_event(_, data, __):
events.append(bpf["rejected_events"].event(data))

View file

@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright (c) 2022 The Bitcoin Core developers
# Copyright (c) 2022-present The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
@ -17,7 +17,11 @@ except ImportError:
from test_framework.messages import CBlockHeader, MAX_HEADERS_RESULTS, msg_headers, msg_version
from test_framework.p2p import P2PInterface
from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import assert_equal, assert_greater_than
from test_framework.util import (
assert_equal,
assert_greater_than,
bpf_cflags,
)
# Tor v3 addresses are 62 chars + 6 chars for the port (':12345').
MAX_PEER_ADDR_LENGTH = 68
@ -283,7 +287,7 @@ class NetTracepointTest(BitcoinTestFramework):
fn_name="trace_inbound_message")
ctx.enable_probe(probe="net:outbound_message",
fn_name="trace_outbound_message")
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
EXPECTED_INOUTBOUND_VERSION_MSG = 1
checked_inbound_version_msg = 0
@ -341,7 +345,7 @@ class NetTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="net:inbound_connection",
fn_name="trace_inbound_connection")
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
inbound_connections = []
EXPECTED_INBOUND_CONNECTIONS = 2
@ -378,7 +382,7 @@ class NetTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="net:outbound_connection",
fn_name="trace_outbound_connection")
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
# that the handle_* function succeeds.
EXPECTED_OUTBOUND_CONNECTIONS = 2
@ -419,7 +423,7 @@ class NetTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="net:evicted_inbound_connection",
fn_name="trace_evicted_inbound_connection")
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
EXPECTED_EVICTED_CONNECTIONS = 2
evicted_connections = []
@ -456,7 +460,7 @@ class NetTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="net:misbehaving_connection",
fn_name="trace_misbehaving_connection")
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
EXPECTED_MISBEHAVING_CONNECTIONS = 2
misbehaving_connections = []
@ -490,7 +494,7 @@ class NetTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="net:closed_connection",
fn_name="trace_closed_connection")
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=net_tracepoints_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
EXPECTED_CLOSED_CONNECTIONS = 2
closed_connections = []


@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright (c) 2022 The Bitcoin Core developers
# Copyright (c) 2022-present The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
@ -15,7 +15,10 @@ except ImportError:
pass
from test_framework.messages import COIN
from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import assert_equal
from test_framework.util import (
assert_equal,
bpf_cflags,
)
from test_framework.wallet import MiniWallet
utxocache_changes_program = """
@ -181,7 +184,7 @@ class UTXOCacheTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="utxocache:uncache",
fn_name="trace_utxocache_uncache")
bpf = BPF(text=utxocache_changes_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=utxocache_changes_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
# The handle_* function is a ctypes callback function called from C. When
# we assert in the handle_* function, the AssertError doesn't propagate
@ -250,7 +253,7 @@ class UTXOCacheTracepointTest(BitcoinTestFramework):
ctx.enable_probe(probe="utxocache:add", fn_name="trace_utxocache_add")
ctx.enable_probe(probe="utxocache:spent",
fn_name="trace_utxocache_spent")
bpf = BPF(text=utxocache_changes_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=utxocache_changes_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
# The handle_* function is a ctypes callback function called from C. When
# we assert in the handle_* function, the AssertError doesn't propagate
@ -350,7 +353,7 @@ class UTXOCacheTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="utxocache:flush",
fn_name="trace_utxocache_flush")
bpf = BPF(text=utxocache_flushes_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=utxocache_flushes_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
# The handle_* function is a ctypes callback function called from C. When
# we assert in the handle_* function, the AssertError doesn't propagate
@ -407,7 +410,7 @@ class UTXOCacheTracepointTest(BitcoinTestFramework):
ctx = USDT(pid=self.nodes[0].process.pid)
ctx.enable_probe(probe="utxocache:flush",
fn_name="trace_utxocache_flush")
bpf = BPF(text=utxocache_flushes_program, usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
bpf = BPF(text=utxocache_flushes_program, usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
bpf["utxocache_flush"].open_perf_buffer(handle_utxocache_flush)
self.log.info("prune blockchain to trigger a flush for pruning")


@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright (c) 2022 The Bitcoin Core developers
# Copyright (c) 2022-present The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
@ -18,8 +18,10 @@ except ImportError:
from test_framework.address import ADDRESS_BCRT1_UNSPENDABLE
from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import assert_equal
from test_framework.util import (
assert_equal,
bpf_cflags,
)
validation_blockconnected_program = """
#include <uapi/linux/ptrace.h>
@ -97,7 +99,7 @@ class ValidationTracepointTest(BitcoinTestFramework):
ctx.enable_probe(probe="validation:block_connected",
fn_name="trace_block_connected")
bpf = BPF(text=validation_blockconnected_program,
usdt_contexts=[ctx], debug=0, cflags=["-Wno-error=implicit-function-declaration"])
usdt_contexts=[ctx], debug=0, cflags=bpf_cflags())
def handle_blockconnected(_, data, __):
event = ctypes.cast(data, ctypes.POINTER(Block)).contents


@ -50,12 +50,10 @@ from test_framework.wallet import MiniWallet, COIN
class MempoolPersistTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser, legacy=False)
def set_test_params(self):
self.num_nodes = 3
self.extra_args = [[], ["-persistmempool=0"], []]
self.uses_wallet = None
def run_test(self):
self.mini_wallet = MiniWallet(self.nodes[2])
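This hunk, and the similar ones in several of the files that follow, all apply one pattern: the per-test add_options()/add_wallet_options() override is dropped and set_test_params() sets self.uses_wallet = None instead. A minimal sketch of the resulting skeleton (the class name is illustrative, and the exact semantics of uses_wallet are defined in the test framework, not in these hunks):

    from test_framework.test_framework import BitcoinTestFramework

    class ExampleTest(BitcoinTestFramework):
        def set_test_params(self):
            self.num_nodes = 2
            # Replaces the former add_options()/add_wallet_options() override.
            self.uses_wallet = None

        def run_test(self):
            pass  # test body unchanged by this refactor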


@ -16,11 +16,9 @@ from test_framework.wallet import MiniWallet
MAX_INITIAL_BROADCAST_DELAY = 15 * 60 # 15 minutes in seconds
class MempoolUnbroadcastTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 2
self.uses_wallet = None
def run_test(self):
self.wallet = MiniWallet(self.nodes[0])


@ -52,8 +52,6 @@ class MiningMainnetTest(BitcoinTestFramework):
help='Block data file (default: %(default)s)',
)
self.add_wallet_options(parser)
def mine(self, height, prev_hash, blocks, node, fees=0):
self.log.debug(f"height={height}")
block = CBlock()


@ -9,7 +9,7 @@ import json
import os
from test_framework.address import address_to_scriptpubkey
from test_framework.descriptors import descsum_create, drop_origins
from test_framework.descriptors import descsum_create
from test_framework.key import ECPubKey
from test_framework.messages import COIN
from test_framework.script_util import keys_to_multisig_script
@ -25,14 +25,10 @@ from test_framework.wallet import (
)
class RpcCreateMultiSigTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 3
self.supports_cli = False
self.enable_wallet_if_possible()
def create_keys(self, num_keys):
self.pub = []
@ -42,29 +38,20 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
self.pub.append(pubkey.hex())
self.priv.append(privkey)
def create_wallet(self, node, wallet_name):
node.createwallet(wallet_name=wallet_name, disable_private_keys=True)
return node.get_wallet_rpc(wallet_name)
def run_test(self):
node0, node1, _node2 = self.nodes
self.wallet = MiniWallet(test_node=node0)
if self.is_wallet_compiled():
self.check_addmultisigaddress_errors()
self.log.info('Generating blocks ...')
self.generate(self.wallet, 149)
wallet_multi = self.create_wallet(node1, 'wmulti') if self._requires_wallet else None
self.create_keys(21) # max number of allowed keys + 1
m_of_n = [(2, 3), (3, 3), (2, 5), (3, 5), (10, 15), (15, 15)]
for (sigs, keys) in m_of_n:
for output_type in ["bech32", "p2sh-segwit", "legacy"]:
self.do_multisig(keys, sigs, output_type, wallet_multi)
self.do_multisig(keys, sigs, output_type)
self.test_multisig_script_limit(wallet_multi)
self.test_mixing_uncompressed_and_compressed_keys(node0, wallet_multi)
self.test_mixing_uncompressed_and_compressed_keys(node0)
self.test_sortedmulti_descriptors_bip67()
# Check that bech32m is currently not allowed
@ -80,22 +67,7 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
res = self.nodes[0].createmultisig(nrequired=nkeys, keys=keys, address_type='bech32')
assert_equal(res['redeemScript'], expected_ms_script.hex())
def check_addmultisigaddress_errors(self):
if self.options.descriptors:
return
self.log.info('Check that addmultisigaddress fails when the private keys are missing')
addresses = [self.nodes[1].getnewaddress(address_type='legacy') for _ in range(2)]
assert_raises_rpc_error(-5, 'no full public key for address', lambda: self.nodes[0].addmultisigaddress(nrequired=1, keys=addresses))
for a in addresses:
# Importing all addresses should not change the result
self.nodes[0].importaddress(a)
assert_raises_rpc_error(-5, 'no full public key for address', lambda: self.nodes[0].addmultisigaddress(nrequired=1, keys=addresses))
# Bech32m address type is disallowed for legacy wallets
pubs = [self.nodes[1].getaddressinfo(addr)["pubkey"] for addr in addresses]
assert_raises_rpc_error(-5, "Bech32m multisig addresses cannot be created with legacy wallets", self.nodes[0].addmultisigaddress, 2, pubs, "", "bech32m")
def test_multisig_script_limit(self, wallet_multi):
def test_multisig_script_limit(self):
node1 = self.nodes[1]
pubkeys = self.pub[0:20]
@ -103,25 +75,14 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
assert_raises_rpc_error(-8, "redeemScript exceeds size limit: 684 > 520", node1.createmultisig, 16, pubkeys, 'legacy')
self.log.info('Test valid 16-20 multisig p2sh-legacy and bech32 (no wallet)')
self.do_multisig(nkeys=20, nsigs=16, output_type="p2sh-segwit", wallet_multi=None)
self.do_multisig(nkeys=20, nsigs=16, output_type="bech32", wallet_multi=None)
self.do_multisig(nkeys=20, nsigs=16, output_type="p2sh-segwit")
self.do_multisig(nkeys=20, nsigs=16, output_type="bech32")
self.log.info('Test invalid 16-21 multisig p2sh-legacy and bech32 (no wallet)')
assert_raises_rpc_error(-8, "Number of keys involved in the multisignature address creation > 20", node1.createmultisig, 16, self.pub, 'p2sh-segwit')
assert_raises_rpc_error(-8, "Number of keys involved in the multisignature address creation > 20", node1.createmultisig, 16, self.pub, 'bech32')
# Check legacy wallet related command
self.log.info('Test legacy redeem script max size limit (with wallet)')
if wallet_multi is not None and not self.options.descriptors:
assert_raises_rpc_error(-8, "redeemScript exceeds size limit: 684 > 520", wallet_multi.addmultisigaddress, 16, pubkeys, '', 'legacy')
self.log.info('Test legacy wallet unsupported operation. 16-20 multisig p2sh-legacy and bech32 generation')
# Due an internal limitation on legacy wallets, the redeem script limit also applies to p2sh-segwit and bech32 (even when the scripts are valid)
# We take this as a "good thing" to tell users to upgrade to descriptors.
assert_raises_rpc_error(-4, "Unsupported multisig script size for legacy wallet. Upgrade to descriptors to overcome this limitation for p2sh-segwit or bech32 scripts", wallet_multi.addmultisigaddress, 16, pubkeys, '', 'p2sh-segwit')
assert_raises_rpc_error(-4, "Unsupported multisig script size for legacy wallet. Upgrade to descriptors to overcome this limitation for p2sh-segwit or bech32 scripts", wallet_multi.addmultisigaddress, 16, pubkeys, '', 'bech32')
def do_multisig(self, nkeys, nsigs, output_type, wallet_multi):
def do_multisig(self, nkeys, nsigs, output_type):
node0, _node1, node2 = self.nodes
pub_keys = self.pub[0: nkeys]
priv_keys = self.priv[0: nkeys]
@ -144,16 +105,6 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
if output_type == 'bech32':
assert madd[0:4] == "bcrt" # actually a bech32 address
if wallet_multi is not None:
# compare against addmultisigaddress
msigw = wallet_multi.addmultisigaddress(nsigs, pub_keys, None, output_type)
maddw = msigw["address"]
mredeemw = msigw["redeemScript"]
assert_equal(desc, drop_origins(msigw['descriptor']))
# addmultisigiaddress and createmultisig work the same
assert maddw == madd
assert mredeemw == mredeem
spk = address_to_scriptpubkey(madd)
value = decimal.Decimal("0.00004000")
tx = self.wallet.send_to(from_node=self.nodes[0], scriptPubKey=spk, amount=int(value * COIN))
@ -162,9 +113,7 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
self.generate(node0, 1)
outval = value - decimal.Decimal("0.00002000") # deduce fee (must be higher than the min relay fee)
# send coins to node2 when wallet is enabled
node2_balance = node2.getbalances()['mine']['trusted'] if self.is_wallet_compiled() else 0
out_addr = node2.getnewaddress() if self.is_wallet_compiled() else getnewdestination('bech32')[2]
out_addr = getnewdestination('bech32')[2]
rawtx = node2.createrawtransaction([{"txid": tx["txid"], "vout": tx["sent_vout"]}], [{out_addr: outval}])
prevtx_err = dict(prevtxs[0])
@ -207,14 +156,10 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
assert_raises_rpc_error(-25, "Input not found or already spent", node2.combinerawtransaction, [rawtx2['hex'], rawtx3['hex']])
# When the wallet is enabled, assert node2 sees the incoming amount
if self.is_wallet_compiled():
assert_equal(node2.getbalances()['mine']['trusted'], node2_balance + outval)
txinfo = node0.getrawtransaction(tx, True, blk)
self.log.info("n/m=%d/%d %s size=%d vsize=%d weight=%d" % (nsigs, nkeys, output_type, txinfo["size"], txinfo["vsize"], txinfo["weight"]))
def test_mixing_uncompressed_and_compressed_keys(self, node, wallet_multi):
def test_mixing_uncompressed_and_compressed_keys(self, node):
self.log.info('Mixed compressed and uncompressed multisigs are not allowed')
pk0, pk1, pk2 = [getnewdestination('bech32')[0].hex() for _ in range(3)]
@ -229,12 +174,6 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
# Results should be the same as this legacy one
legacy_addr = node.createmultisig(2, keys, 'legacy')['address']
if wallet_multi is not None:
# 'addmultisigaddress' should return the same address
result = wallet_multi.addmultisigaddress(2, keys, '', 'legacy')
assert_equal(legacy_addr, result['address'])
assert 'warnings' not in result
# Generate addresses with the segwit types. These should all make legacy addresses
err_msg = ["Unable to make chosen address type, please ensure no uncompressed public keys are present."]
@ -243,11 +182,6 @@ class RpcCreateMultiSigTest(BitcoinTestFramework):
assert_equal(legacy_addr, result['address'])
assert_equal(result['warnings'], err_msg)
if wallet_multi is not None:
result = wallet_multi.addmultisigaddress(nrequired=2, keys=keys, address_type=addr_type)
assert_equal(legacy_addr, result['address'])
assert_equal(result['warnings'], err_msg)
def test_sortedmulti_descriptors_bip67(self):
self.log.info('Testing sortedmulti descriptors with BIP 67 test vectors')
with open(os.path.join(os.path.dirname(os.path.realpath(__file__)), 'data/rpc_bip67.json'), encoding='utf-8') as f:
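With the legacy addmultisigaddress comparison paths removed, rpc_createmultisig.py now exercises only the wallet-independent createmultisig RPC. For reference, the minimal call pattern the remaining test relies on (the key values are placeholders):

    res = node0.createmultisig(nrequired=2, keys=[pub0, pub1, pub2], address_type="bech32")
    addr = res["address"]               # the multisig address
    redeem_script = res["redeemScript"]
    desc = res["descriptor"]            # descriptor with checksum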


@ -7,13 +7,11 @@ from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import assert_raises_rpc_error
class DeprecatedRpcTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 1
self.setup_clean_chain = True
self.extra_args = [[]]
self.uses_wallet = None
def run_test(self):
# This test should be used to verify the errors of the currently


@ -43,12 +43,10 @@ def process_mapping(fname):
class HelpRpcTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 1
self.supports_cli = False
self.uses_wallet = None
def run_test(self):
self.test_client_conversion_table()


@ -40,12 +40,10 @@ INVALID_ADDRESS = 'asfah14i8fajz0123f'
INVALID_ADDRESS_2 = '1q049ldschfnwystcqnsvyfpj23mpsg3jcedq9xv'
class InvalidAddressErrorMessageTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 1
self.uses_wallet = None
def check_valid(self, addr):
info = self.nodes[0].validateaddress(addr)


@ -56,9 +56,6 @@ import os
class PSBTTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser)
def set_test_params(self):
self.num_nodes = 3
self.extra_args = [
@ -114,7 +111,7 @@ class PSBTTest(BitcoinTestFramework):
# Mine a transaction that credits the offline address
offline_addr = offline_node.getnewaddress(address_type="bech32m")
online_addr = w2.getnewaddress(address_type="bech32m")
wonline.importaddress(offline_addr, "", False)
wonline.importaddress(offline_addr, label="", rescan=False)
mining_wallet = mining_node.get_wallet_rpc(self.default_wallet_name)
mining_wallet.sendtoaddress(address=offline_addr, amount=1.0)
self.generate(mining_node, nblocks=1, sync_fun=lambda: self.sync_all([online_node, mining_node]))
@ -315,13 +312,9 @@ class PSBTTest(BitcoinTestFramework):
wmulti = self.nodes[2].get_wallet_rpc('wmulti')
# Create all the addresses
p2sh = wmulti.addmultisigaddress(2, [pubkey0, pubkey1, pubkey2], "", "legacy")['address']
p2wsh = wmulti.addmultisigaddress(2, [pubkey0, pubkey1, pubkey2], "", "bech32")['address']
p2sh_p2wsh = wmulti.addmultisigaddress(2, [pubkey0, pubkey1, pubkey2], "", "p2sh-segwit")['address']
if not self.options.descriptors:
wmulti.importaddress(p2sh)
wmulti.importaddress(p2wsh)
wmulti.importaddress(p2sh_p2wsh)
p2sh = wmulti.addmultisigaddress(2, [pubkey0, pubkey1, pubkey2], label="", address_type="legacy")["address"]
p2wsh = wmulti.addmultisigaddress(2, [pubkey0, pubkey1, pubkey2], label="", address_type="bech32")["address"]
p2sh_p2wsh = wmulti.addmultisigaddress(2, [pubkey0, pubkey1, pubkey2], label="", address_type="p2sh-segwit")["address"]
p2wpkh = self.nodes[1].getnewaddress("", "bech32")
p2pkh = self.nodes[1].getnewaddress("", "legacy")
p2sh_p2wpkh = self.nodes[1].getnewaddress("", "p2sh-segwit")
@ -655,8 +648,7 @@ class PSBTTest(BitcoinTestFramework):
for i, signer in enumerate(signers):
self.nodes[2].unloadwallet("wallet{}".format(i))
if self.options.descriptors:
self.test_utxo_conversion()
self.test_utxo_conversion()
self.test_psbt_incomplete_after_invalid_modification()
self.test_input_confs_control()
@ -782,10 +774,7 @@ class PSBTTest(BitcoinTestFramework):
# Make a weird but signable script. sh(wsh(pkh())) descriptor accomplishes this
desc = descsum_create("sh(wsh(pkh({})))".format(privkey))
if self.options.descriptors:
res = self.nodes[0].importdescriptors([{"desc": desc, "timestamp": "now"}])
else:
res = self.nodes[0].importmulti([{"desc": desc, "timestamp": "now"}])
res = self.nodes[0].importdescriptors([{"desc": desc, "timestamp": "now"}])
assert res[0]["success"]
addr = self.nodes[0].deriveaddresses(desc)[0]
addr_info = self.nodes[0].getaddressinfo(addr)
@ -867,10 +856,7 @@ class PSBTTest(BitcoinTestFramework):
assert_equal(psbt2["fee"], psbt3["fee"])
# Import the external utxo descriptor so that we can sign for it from the test wallet
if self.options.descriptors:
res = wallet.importdescriptors([{"desc": desc, "timestamp": "now"}])
else:
res = wallet.importmulti([{"desc": desc, "timestamp": "now"}])
res = wallet.importdescriptors([{"desc": desc, "timestamp": "now"}])
assert res[0]["success"]
# The provided weight should override the calculated weight for a wallet input
psbt3 = wallet.walletcreatefundedpsbt(
@ -887,10 +873,7 @@ class PSBTTest(BitcoinTestFramework):
privkey, pubkey = generate_keypair(wif=True)
desc = descsum_create("wsh(pkh({}))".format(pubkey.hex()))
if self.options.descriptors:
res = watchonly.importdescriptors([{"desc": desc, "timestamp": "now"}])
else:
res = watchonly.importmulti([{"desc": desc, "timestamp": "now"}])
res = watchonly.importdescriptors([{"desc": desc, "timestamp": "now"}])
assert res[0]["success"]
addr = self.nodes[0].deriveaddresses(desc)[0]
self.nodes[0].sendtoaddress(addr, 10)
@ -902,46 +885,45 @@ class PSBTTest(BitcoinTestFramework):
self.nodes[0].sendrawtransaction(signed_tx["hex"])
# Same test but for taproot
if self.options.descriptors:
privkey, pubkey = generate_keypair(wif=True)
privkey, pubkey = generate_keypair(wif=True)
desc = descsum_create("tr({},pk({}))".format(H_POINT, pubkey.hex()))
res = watchonly.importdescriptors([{"desc": desc, "timestamp": "now"}])
assert res[0]["success"]
addr = self.nodes[0].deriveaddresses(desc)[0]
self.nodes[0].sendtoaddress(addr, 10)
self.generate(self.nodes[0], 1)
self.nodes[0].importdescriptors([{"desc": descsum_create("tr({})".format(privkey)), "timestamp":"now"}])
desc = descsum_create("tr({},pk({}))".format(H_POINT, pubkey.hex()))
res = watchonly.importdescriptors([{"desc": desc, "timestamp": "now"}])
assert res[0]["success"]
addr = self.nodes[0].deriveaddresses(desc)[0]
self.nodes[0].sendtoaddress(addr, 10)
self.generate(self.nodes[0], 1)
self.nodes[0].importdescriptors([{"desc": descsum_create("tr({})".format(privkey)), "timestamp":"now"}])
psbt = watchonly.sendall([wallet.getnewaddress(), addr])["psbt"]
processed_psbt = self.nodes[0].walletprocesspsbt(psbt)
txid = self.nodes[0].sendrawtransaction(processed_psbt["hex"])
vout = find_vout_for_address(self.nodes[0], txid, addr)
psbt = watchonly.sendall([wallet.getnewaddress(), addr])["psbt"]
processed_psbt = self.nodes[0].walletprocesspsbt(psbt)
txid = self.nodes[0].sendrawtransaction(processed_psbt["hex"])
vout = find_vout_for_address(self.nodes[0], txid, addr)
# Make sure tap tree is in psbt
parsed_psbt = PSBT.from_base64(psbt)
assert_greater_than(len(parsed_psbt.o[vout].map[PSBT_OUT_TAP_TREE]), 0)
assert "taproot_tree" in self.nodes[0].decodepsbt(psbt)["outputs"][vout]
parsed_psbt.make_blank()
comb_psbt = self.nodes[0].combinepsbt([psbt, parsed_psbt.to_base64()])
assert_equal(comb_psbt, psbt)
# Make sure tap tree is in psbt
parsed_psbt = PSBT.from_base64(psbt)
assert_greater_than(len(parsed_psbt.o[vout].map[PSBT_OUT_TAP_TREE]), 0)
assert "taproot_tree" in self.nodes[0].decodepsbt(psbt)["outputs"][vout]
parsed_psbt.make_blank()
comb_psbt = self.nodes[0].combinepsbt([psbt, parsed_psbt.to_base64()])
assert_equal(comb_psbt, psbt)
self.log.info("Test that walletprocesspsbt both updates and signs a non-updated psbt containing Taproot inputs")
addr = self.nodes[0].getnewaddress("", "bech32m")
utxo = self.create_outpoints(self.nodes[0], outputs=[{addr: 1}])[0]
psbt = self.nodes[0].createpsbt([utxo], [{self.nodes[0].getnewaddress(): 0.9999}])
signed = self.nodes[0].walletprocesspsbt(psbt)
rawtx = signed["hex"]
self.nodes[0].sendrawtransaction(rawtx)
self.generate(self.nodes[0], 1)
self.log.info("Test that walletprocesspsbt both updates and signs a non-updated psbt containing Taproot inputs")
addr = self.nodes[0].getnewaddress("", "bech32m")
utxo = self.create_outpoints(self.nodes[0], outputs=[{addr: 1}])[0]
psbt = self.nodes[0].createpsbt([utxo], [{self.nodes[0].getnewaddress(): 0.9999}])
signed = self.nodes[0].walletprocesspsbt(psbt)
rawtx = signed["hex"]
self.nodes[0].sendrawtransaction(rawtx)
self.generate(self.nodes[0], 1)
# Make sure tap tree is not in psbt
parsed_psbt = PSBT.from_base64(psbt)
assert PSBT_OUT_TAP_TREE not in parsed_psbt.o[0].map
assert "taproot_tree" not in self.nodes[0].decodepsbt(psbt)["outputs"][0]
parsed_psbt.make_blank()
comb_psbt = self.nodes[0].combinepsbt([psbt, parsed_psbt.to_base64()])
assert_equal(comb_psbt, psbt)
# Make sure tap tree is not in psbt
parsed_psbt = PSBT.from_base64(psbt)
assert PSBT_OUT_TAP_TREE not in parsed_psbt.o[0].map
assert "taproot_tree" not in self.nodes[0].decodepsbt(psbt)["outputs"][0]
parsed_psbt.make_blank()
comb_psbt = self.nodes[0].combinepsbt([psbt, parsed_psbt.to_base64()])
assert_equal(comb_psbt, psbt)
self.log.info("Test walletprocesspsbt raises if an invalid sighashtype is passed")
assert_raises_rpc_error(-8, "'all' is not a valid sighash parameter.", self.nodes[0].walletprocesspsbt, psbt, sighashtype="all")


@ -64,9 +64,6 @@ class multidict(dict):
class RawTransactionsTest(BitcoinTestFramework):
def add_options(self, parser):
self.add_wallet_options(parser, descriptors=False)
def set_test_params(self):
self.num_nodes = 3
self.extra_args = [
@ -77,6 +74,7 @@ class RawTransactionsTest(BitcoinTestFramework):
# whitelist peers to speed up tx relay / mempool sync
self.noban_tx_relay = True
self.supports_cli = False
self.uses_wallet = None
def setup_network(self):
super().setup_network()
@ -91,12 +89,8 @@ class RawTransactionsTest(BitcoinTestFramework):
self.sendrawtransaction_testmempoolaccept_tests()
self.decoderawtransaction_tests()
self.transaction_version_number_tests()
if self.is_specified_wallet_compiled() and not self.options.descriptors:
self.import_deterministic_coinbase_privkeys()
self.raw_multisig_transaction_legacy_tests()
self.getrawtransaction_verbosity_tests()
def getrawtransaction_tests(self):
tx = self.wallet.send_self_transfer(from_node=self.nodes[0])
self.generate(self.nodes[0], 1)
@ -493,125 +487,5 @@ class RawTransactionsTest(BitcoinTestFramework):
decrawtx = self.nodes[0].decoderawtransaction(rawtx)
assert_equal(decrawtx['version'], 0xffffffff)
def raw_multisig_transaction_legacy_tests(self):
self.log.info("Test raw multisig transactions (legacy)")
# The traditional multisig workflow does not work with descriptor wallets so these are legacy only.
# The multisig workflow with descriptor wallets uses PSBTs and is tested elsewhere, no need to do them here.
# 2of2 test
addr1 = self.nodes[2].getnewaddress()
addr2 = self.nodes[2].getnewaddress()
addr1Obj = self.nodes[2].getaddressinfo(addr1)
addr2Obj = self.nodes[2].getaddressinfo(addr2)
# Tests for createmultisig and addmultisigaddress
assert_raises_rpc_error(-5, 'Pubkey "01020304" must have a length of either 33 or 65 bytes', self.nodes[0].createmultisig, 1, ["01020304"])
# createmultisig can only take public keys
self.nodes[0].createmultisig(2, [addr1Obj['pubkey'], addr2Obj['pubkey']])
# addmultisigaddress can take both pubkeys and addresses so long as they are in the wallet, which is tested here
assert_raises_rpc_error(-5, f'Pubkey "{addr1}" must be a hex string', self.nodes[0].createmultisig, 2, [addr1Obj['pubkey'], addr1])
mSigObj = self.nodes[2].addmultisigaddress(2, [addr1Obj['pubkey'], addr1])['address']
# use balance deltas instead of absolute values
bal = self.nodes[2].getbalance()
# send 1.2 BTC to msig adr
txId = self.nodes[0].sendtoaddress(mSigObj, 1.2)
self.sync_all()
self.generate(self.nodes[0], 1)
# node2 has both keys of the 2of2 ms addr, tx should affect the balance
assert_equal(self.nodes[2].getbalance(), bal + Decimal('1.20000000'))
# 2of3 test from different nodes
bal = self.nodes[2].getbalance()
addr1 = self.nodes[1].getnewaddress()
addr2 = self.nodes[2].getnewaddress()
addr3 = self.nodes[2].getnewaddress()
addr1Obj = self.nodes[1].getaddressinfo(addr1)
addr2Obj = self.nodes[2].getaddressinfo(addr2)
addr3Obj = self.nodes[2].getaddressinfo(addr3)
mSigObj = self.nodes[2].addmultisigaddress(2, [addr1Obj['pubkey'], addr2Obj['pubkey'], addr3Obj['pubkey']])['address']
txId = self.nodes[0].sendtoaddress(mSigObj, 2.2)
decTx = self.nodes[0].gettransaction(txId)
rawTx = self.nodes[0].decoderawtransaction(decTx['hex'])
self.sync_all()
self.generate(self.nodes[0], 1)
# THIS IS AN INCOMPLETE FEATURE
# NODE2 HAS TWO OF THREE KEYS AND THE FUNDS SHOULD BE SPENDABLE AND COUNT AT BALANCE CALCULATION
assert_equal(self.nodes[2].getbalance(), bal) # for now, assume the funds of a 2of3 multisig tx are not marked as spendable
txDetails = self.nodes[0].gettransaction(txId, True)
rawTx = self.nodes[0].decoderawtransaction(txDetails['hex'])
vout = next(o for o in rawTx['vout'] if o['value'] == Decimal('2.20000000'))
bal = self.nodes[0].getbalance()
inputs = [{"txid": txId, "vout": vout['n'], "scriptPubKey": vout['scriptPubKey']['hex'], "amount": vout['value']}]
outputs = {self.nodes[0].getnewaddress(): 2.19}
rawTx = self.nodes[2].createrawtransaction(inputs, outputs)
rawTxPartialSigned = self.nodes[1].signrawtransactionwithwallet(rawTx, inputs)
assert_equal(rawTxPartialSigned['complete'], False) # node1 only has one key, can't comp. sign the tx
rawTxSigned = self.nodes[2].signrawtransactionwithwallet(rawTx, inputs)
assert_equal(rawTxSigned['complete'], True) # node2 can sign the tx compl., own two of three keys
self.nodes[2].sendrawtransaction(rawTxSigned['hex'])
rawTx = self.nodes[0].decoderawtransaction(rawTxSigned['hex'])
self.sync_all()
self.generate(self.nodes[0], 1)
assert_equal(self.nodes[0].getbalance(), bal + Decimal('50.00000000') + Decimal('2.19000000')) # block reward + tx
# 2of2 test for combining transactions
bal = self.nodes[2].getbalance()
addr1 = self.nodes[1].getnewaddress()
addr2 = self.nodes[2].getnewaddress()
addr1Obj = self.nodes[1].getaddressinfo(addr1)
addr2Obj = self.nodes[2].getaddressinfo(addr2)
self.nodes[1].addmultisigaddress(2, [addr1Obj['pubkey'], addr2Obj['pubkey']])['address']
mSigObj = self.nodes[2].addmultisigaddress(2, [addr1Obj['pubkey'], addr2Obj['pubkey']])['address']
mSigObjValid = self.nodes[2].getaddressinfo(mSigObj)
txId = self.nodes[0].sendtoaddress(mSigObj, 2.2)
decTx = self.nodes[0].gettransaction(txId)
rawTx2 = self.nodes[0].decoderawtransaction(decTx['hex'])
self.sync_all()
self.generate(self.nodes[0], 1)
assert_equal(self.nodes[2].getbalance(), bal) # the funds of a 2of2 multisig tx should not be marked as spendable
txDetails = self.nodes[0].gettransaction(txId, True)
rawTx2 = self.nodes[0].decoderawtransaction(txDetails['hex'])
vout = next(o for o in rawTx2['vout'] if o['value'] == Decimal('2.20000000'))
bal = self.nodes[0].getbalance()
inputs = [{"txid": txId, "vout": vout['n'], "scriptPubKey": vout['scriptPubKey']['hex'], "redeemScript": mSigObjValid['hex'], "amount": vout['value']}]
outputs = {self.nodes[0].getnewaddress(): 2.19}
rawTx2 = self.nodes[2].createrawtransaction(inputs, outputs)
rawTxPartialSigned1 = self.nodes[1].signrawtransactionwithwallet(rawTx2, inputs)
self.log.debug(rawTxPartialSigned1)
assert_equal(rawTxPartialSigned1['complete'], False) # node1 only has one key, can't comp. sign the tx
rawTxPartialSigned2 = self.nodes[2].signrawtransactionwithwallet(rawTx2, inputs)
self.log.debug(rawTxPartialSigned2)
assert_equal(rawTxPartialSigned2['complete'], False) # node2 only has one key, can't comp. sign the tx
assert_raises_rpc_error(-22, "TX decode failed", self.nodes[0].combinerawtransaction, [rawTxPartialSigned1['hex'], rawTxPartialSigned2['hex'] + "00"])
assert_raises_rpc_error(-22, "Missing transactions", self.nodes[0].combinerawtransaction, [])
rawTxComb = self.nodes[2].combinerawtransaction([rawTxPartialSigned1['hex'], rawTxPartialSigned2['hex']])
self.log.debug(rawTxComb)
self.nodes[2].sendrawtransaction(rawTxComb)
rawTx2 = self.nodes[0].decoderawtransaction(rawTxComb)
self.sync_all()
self.generate(self.nodes[0], 1)
assert_equal(self.nodes[0].getbalance(), bal + Decimal('50.00000000') + Decimal('2.19000000')) # block reward + tx
assert_raises_rpc_error(-25, "Input not found or already spent", self.nodes[0].combinerawtransaction, [rawTxPartialSigned1['hex'], rawTxPartialSigned2['hex']])
if __name__ == '__main__':
RawTransactionsTest(__file__).main()


@ -42,6 +42,7 @@ COIN = 100000000 # 1 btc in satoshis
MAX_MONEY = 21000000 * COIN
MAX_BIP125_RBF_SEQUENCE = 0xfffffffd # Sequence number that is rbf-opt-in (BIP 125) and csv-opt-out (BIP 68)
MAX_SEQUENCE_NONFINAL = 0xfffffffe # Sequence number that is csv-opt-out (BIP 68)
SEQUENCE_FINAL = 0xffffffff # Sequence number that disables nLockTime if set for every input of a tx
MAX_PROTOCOL_MESSAGE_LENGTH = 4000000 # Maximum length of incoming protocol messages
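The new SEQUENCE_FINAL constant names the 0xffffffff sequence value, complementing the existing MAX_BIP125_RBF_SEQUENCE and MAX_SEQUENCE_NONFINAL constants. A short usage sketch (the txid is a placeholder):

    from test_framework.messages import COutPoint, CTransaction, CTxIn, SEQUENCE_FINAL

    tx = CTransaction()
    tx.vin.append(CTxIn(COutPoint(int(txid, 16), 0), nSequence=SEQUENCE_FINAL))
    # With every input at SEQUENCE_FINAL, the transaction's nLockTime is not enforced.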

Some files were not shown because too many files have changed in this diff.