History log of /linux-6.15/include/crypto/internal/hash.h (Results 1 – 25 of 61)
Revision Date Author Comments
Revision tags: v6.15, v6.15-rc7, v6.15-rc6, v6.15-rc5, v6.15-rc4, v6.15-rc3, v6.15-rc2
# b2e689ba 11-Apr-2025 Herbert Xu <[email protected]>

crypto: ahash - Disable request chaining

Disable hash request chaining in case a driver that copies an
ahash_request object by hand accidentally triggers chaining.

Reported-by: Manorit Chawdhry <[email protected]>
Fixes: f2ffe5a9183d ("crypto: hash - Add request chaining API")
Signed-off-by: Herbert Xu <[email protected]>
Tested-by: Manorit Chawdhry <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v6.15-rc1, v6.14, v6.14-rc7, v6.14-rc6, v6.14-rc5, v6.14-rc4, v6.14-rc3
# 439963cd 16-Feb-2025 Herbert Xu <[email protected]>

crypto: ahash - Add virtual address support

This patch adds virtual address support to ahash. Virtual addresses
were previously only supported through shash. The user may choose
to use virtual addresses with ahash by calling ahash_request_set_virt
instead of ahash_request_set_crypt.

The API will take care of translating this to an SG list if necessary,
unless the algorithm declares that it supports chaining. Therefore
in order for an ahash algorithm to support chaining, it must also
support virtual addresses directly.

Signed-off-by: Herbert Xu <[email protected]>
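
As a rough usage sketch of the new entry point: everything below except
ahash_request_set_virt() is the long-standing ahash API, and the function
name example_sha256_virt() plus the error handling are illustrative only,
not taken from the patch.

#include <crypto/hash.h>
#include <linux/err.h>
#include <linux/slab.h>

/* Sketch: hash a linear kernel buffer through ahash without an SG list. */
static int example_sha256_virt(const u8 *buf, unsigned int len, u8 *digest)
{
        struct crypto_ahash *tfm;
        struct ahash_request *req;
        DECLARE_CRYPTO_WAIT(wait);
        int err;

        tfm = crypto_alloc_ahash("sha256", 0, 0);
        if (IS_ERR(tfm))
                return PTR_ERR(tfm);

        req = ahash_request_alloc(tfm, GFP_KERNEL);
        if (!req) {
                crypto_free_ahash(tfm);
                return -ENOMEM;
        }

        ahash_request_set_callback(req, CRYPTO_TFM_REQ_MAY_BACKLOG,
                                   crypto_req_done, &wait);
        /* Virtual-address counterpart of ahash_request_set_crypt(). */
        ahash_request_set_virt(req, buf, digest, len);

        err = crypto_wait_req(crypto_ahash_digest(req), &wait);

        ahash_request_free(req);
        crypto_free_ahash(tfm);
        return err;
}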


# f2ffe5a9 16-Feb-2025 Herbert Xu <[email protected]>

crypto: hash - Add request chaining API

This adds request chaining to the ahash interface. Request chaining
allows multiple requests to be submitted in one shot. An algorithm
can elect to receive chained requests by setting the flag
CRYPTO_ALG_REQ_CHAIN. If this bit is not set, the API will break
up chained requests and submit them one-by-one.

A new err field is added to struct crypto_async_request to record
the return value for each individual request.

Signed-off-by: Herbert Xu <[email protected]>
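
As an illustration of the opt-in described here, a hypothetical driver
would set the flag in its algorithm definition roughly as below; the
example_* names and stub op are made up, and note that the newest entry
above disables request chaining again, so this is historical context only.

#include <crypto/internal/hash.h>
#include <crypto/sha2.h>
#include <linux/errno.h>
#include <linux/module.h>

/* Stub op standing in for the driver's real submission paths. */
static int example_op(struct ahash_request *req)
{
        return -ENOSYS;
}

static struct ahash_alg example_sha256_alg = {
        .init   = example_op,
        .update = example_op,
        .final  = example_op,
        .digest = example_op,
        .halg = {
                .digestsize = SHA256_DIGEST_SIZE,
                .base = {
                        .cra_name        = "sha256",
                        .cra_driver_name = "sha256-example",
                        .cra_priority    = 300,
                        /* Without CRYPTO_ALG_REQ_CHAIN the API breaks
                         * chains up and submits requests one by one. */
                        .cra_flags       = CRYPTO_ALG_ASYNC |
                                           CRYPTO_ALG_REQ_CHAIN,
                        .cra_blocksize   = SHA256_BLOCK_SIZE,
                        .cra_module      = THIS_MODULE,
                },
        },
};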


Revision tags: v6.14-rc2, v6.14-rc1, v6.13, v6.13-rc7, v6.13-rc6, v6.13-rc5
# 7fa48173 27-Dec-2024 Eric Biggers <[email protected]>

crypto: ahash - make hash walk functions private to ahash.c

Due to the removal of the Niagara2 SPU driver, crypto_hash_walk_first(),
crypto_hash_walk_done(), crypto_hash_walk_last(), and struct
crypto_hash_walk are now only used in crypto/ahash.c. Therefore, make
them all private to crypto/ahash.c. I.e. un-export the two functions
that were exported, make the functions static, and move the struct
definition to the .c file. As part of this, move the functions to
earlier in the file to avoid needing to add forward declarations.

Signed-off-by: Eric Biggers <[email protected]>
Acked-by: Ard Biesheuvel <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v6.13-rc4, v6.13-rc3, v6.13-rc2, v6.13-rc1, v6.12, v6.12-rc7, v6.12-rc6, v6.12-rc5, v6.12-rc4, v6.12-rc3, v6.12-rc2, v6.12-rc1, v6.11, v6.11-rc7, v6.11-rc6, v6.11-rc5, v6.11-rc4, v6.11-rc3, v6.11-rc2, v6.11-rc1, v6.10, v6.10-rc7, v6.10-rc6, v6.10-rc5, v6.10-rc4, v6.10-rc3, v6.10-rc2, v6.10-rc1, v6.9, v6.9-rc7, v6.9-rc6, v6.9-rc5, v6.9-rc4, v6.9-rc3, v6.9-rc2, v6.9-rc1, v6.8, v6.8-rc7, v6.8-rc6, v6.8-rc5, v6.8-rc4, v6.8-rc3, v6.8-rc2
# 9a14b311 27-Jan-2024 Eric Biggers <[email protected]>

crypto: ahash - unexport crypto_hash_alg_has_setkey()

Since crypto_hash_alg_has_setkey() is only called from ahash.c itself,
make it a static function.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v6.8-rc1, v6.7, v6.7-rc8, v6.7-rc7, v6.7-rc6, v6.7-rc5, v6.7-rc4, v6.7-rc3, v6.7-rc2, v6.7-rc1, v6.6, v6.6-rc7
# c626910f 22-Oct-2023 Eric Biggers <[email protected]>

crypto: ahash - remove support for nonzero alignmask

Currently, the ahash API checks the alignment of all key and result
buffers against the algorithm's declared alignmask, and for any
unaligned buffers it falls back to manually aligned temporary buffers.

This is virtually useless, however. First, since it does not apply to
the message, its effect is much more limited than e.g. is the case for
the alignmask for "skcipher". Second, the key and result buffers are
given as virtual addresses and cannot (in general) be DMA'ed into, so
drivers end up having to copy to/from them in software anyway. As a
result it's easy to use memcpy() or the unaligned access helpers.

The crypto_hash_walk_*() helper functions do use the alignmask to align
the message. But with one exception those are only used for shash
algorithms being exposed via the ahash API, not for native ahashes, and
aligning the message is not required in this case, especially now that
alignmask support has been removed from shash. The exception is the
n2_core driver, which doesn't set an alignmask.

In any case, no ahash algorithms actually set a nonzero alignmask
anymore. Therefore, remove support for it from ahash. The benefit is
that all the code to handle "misaligned" buffers in the ahash API goes
away, reducing the overhead of the ahash API.

This follows the same change that was made to shash.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


# acd77995 22-Oct-2023 Eric Biggers <[email protected]>

crypto: shash - remove crypto_shash_ctx_aligned()

crypto_shash_ctx_aligned() is no longer used, and it is useless now that
shash algorithms don't support nonzero alignmasks, so remove it.

Also remove crypto_tfm_ctx_aligned() which was only called by
crypto_shash_ctx_aligned(). It's unlikely to be useful again, since it
seems inappropriate to use cra_alignmask to represent alignment for the
tfm context when it already means alignment for inputs/outputs.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v6.6-rc6, v6.6-rc5, v6.6-rc4, v6.6-rc3, v6.6-rc2, v6.6-rc1, v6.5, v6.5-rc7, v6.5-rc6, v6.5-rc5, v6.5-rc4, v6.5-rc3, v6.5-rc2, v6.5-rc1, v6.4, v6.4-rc7, v6.4-rc6, v6.4-rc5, v6.4-rc4, v6.4-rc3, v6.4-rc2, v6.4-rc1, v6.3
# 3908edf8 20-Apr-2023 Herbert Xu <[email protected]>

crypto: hash - Make crypto_ahash_alg helper available

Move the crypto_ahash_alg helper into include/crypto/internal so
that drivers can use it.

Signed-off-by: Herbert Xu <[email protected]>
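
A short sketch of why drivers want this: with crypto_ahash_alg() exposed,
an init_tfm handler can walk from the transform back to the driver's own
wrapper around struct ahash_alg. The example_* names are hypothetical.

#include <crypto/internal/hash.h>
#include <linux/container_of.h>
#include <linux/errno.h>

/* Hypothetical driver structure embedding the generic ahash_alg. */
struct example_drv_alg {
        void __iomem *regs;             /* driver-private state */
        struct ahash_alg alg;
};

static int example_init_tfm(struct crypto_ahash *tfm)
{
        struct ahash_alg *alg = crypto_ahash_alg(tfm);
        struct example_drv_alg *dalg =
                container_of(alg, struct example_drv_alg, alg);

        /* driver-private data is now reachable from the transform */
        return dalg->regs ? 0 : -ENODEV;
}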


# c7535fb2 20-Apr-2023 Herbert Xu <[email protected]>

crypto: hash - Add statesize to crypto_ahash

As ahash drivers may need to use fallbacks, their state size
is thus variable. Deal with this by making it an attribute
of crypto_ahash.

Signed-off-by: Herbert Xu <[email protected]>
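
The practical consequence is that export buffers should be sized from the
tfm rather than from the algorithm; a minimal sketch, with the function
name example_export_state() being hypothetical.

#include <crypto/hash.h>
#include <linux/slab.h>

/* Export the partial state of an in-progress hash request (sketch). */
static void *example_export_state(struct ahash_request *req, int *err)
{
        struct crypto_ahash *tfm = crypto_ahash_reqtfm(req);
        void *state;

        /* crypto_ahash_statesize() now reflects the tfm, fallback included. */
        state = kmalloc(crypto_ahash_statesize(tfm), GFP_KERNEL);
        if (!state) {
                *err = -ENOMEM;
                return NULL;
        }

        *err = crypto_ahash_export(req, state);
        if (*err) {
                kfree(state);
                return NULL;
        }
        return state;
}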


Revision tags: v6.3-rc7
# ed3630b8 13-Apr-2023 Herbert Xu <[email protected]>

crypto: hash - Add crypto_clone_ahash/shash

This patch adds the helpers crypto_clone_ahash and crypto_clone_shash.
They are the hash-specific counterparts of crypto_clone_tfm.

This allows code paths that cannot otherwise allocate a hash tfm
object to do so. Once a new tfm has been obtained its key could
then be changed without impacting other users.

Note that only algorithms that implement clone_tfm can be cloned.
However, all keyless hashes can be cloned by simply reusing the
tfm object.

Signed-off-by: Herbert Xu <[email protected]>
Reviewed-by: Simon Horman <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>
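
A minimal sketch of the intended use on the shash side (the same pattern
applies to crypto_clone_ahash()); everything apart from the clone, setkey
and free calls is illustrative.

#include <crypto/hash.h>
#include <linux/err.h>

/* Take a private, independently keyed copy of an existing keyed hash
 * tfm without disturbing its other users (sketch). */
static struct crypto_shash *example_rekeyed_clone(struct crypto_shash *orig,
                                                  const u8 *key,
                                                  unsigned int keylen)
{
        struct crypto_shash *clone;
        int err;

        clone = crypto_clone_shash(orig);
        if (IS_ERR(clone))
                return clone;

        err = crypto_shash_setkey(clone, key, keylen);
        if (err) {
                crypto_free_shash(clone);
                return ERR_PTR(err);
        }
        return clone;
}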


Revision tags: v6.3-rc6, v6.3-rc5, v6.3-rc4, v6.3-rc3, v6.3-rc2, v6.3-rc1, v6.2, v6.2-rc8
# d9588045 10-Feb-2023 Herbert Xu <[email protected]>

crypto: hash - Use crypto_request_complete

Use the crypto_request_complete helper instead of calling the
completion function directly.

This patch also removes the voodoo programming previously used
for unaligned ahash operations and replaces it with a sub-request.

Signed-off-by: Herbert Xu <[email protected]>
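
In a driver's completion path the change amounts to going through the
helper instead of dereferencing the callback by hand; a minimal sketch
with a hypothetical driver function name.

#include <crypto/internal/hash.h>

/* Called when the (hypothetical) hardware finishes a hash request. */
static void example_hash_done(struct ahash_request *req, int err)
{
        /* previously: req->base.complete(&req->base, err); */
        crypto_request_complete(&req->base, err);
}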


Revision tags: v6.2-rc7, v6.2-rc6, v6.2-rc5, v6.2-rc4, v6.2-rc3, v6.2-rc2, v6.2-rc1, v6.1, v6.1-rc8, v6.1-rc7
# b5f755fb 25-Nov-2022 Herbert Xu <[email protected]>

crypto: hash - Add ctx helpers with DMA alignment

This patch adds helpers to access the ahash context structure and
request context structure with an added alignment for DMA access.

Signed-off-by: Herbert Xu <[email protected]>
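
A sketch of how a DMA-capable driver would use the new accessors; the
helper names crypto_ahash_set_reqsize_dma() and ahash_request_ctx_dma()
are assumed to be the ones this change adds, and the example_* names and
context layout are made up.

#include <crypto/internal/hash.h>
#include <linux/errno.h>
#include <linux/string.h>

/* Hypothetical per-request context the hardware will DMA to/from. */
struct example_req_ctx {
        u8 hw_desc[64];
        u8 result[64];
};

static int example_init_tfm(struct crypto_ahash *tfm)
{
        /* Reserve a request context padded and aligned for DMA access. */
        crypto_ahash_set_reqsize_dma(tfm, sizeof(struct example_req_ctx));
        return 0;
}

static int example_digest(struct ahash_request *req)
{
        struct example_req_ctx *ctx = ahash_request_ctx_dma(req);

        /* ctx may now be handed to the DMA engine (sketch only) */
        memset(ctx->hw_desc, 0, sizeof(ctx->hw_desc));
        return -EINPROGRESS;
}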


Revision tags: v6.1-rc6
# c060e16d 18-Nov-2022 Eric Biggers <[email protected]>

Revert "crypto: shash - avoid comparing pointers to exported functions under CFI"

This reverts commit 22ca9f4aaf431a9413dcc115dd590123307f274f because CFI
no longer breaks cross-module function addr

Revert "crypto: shash - avoid comparing pointers to exported functions under CFI"

This reverts commit 22ca9f4aaf431a9413dcc115dd590123307f274f because CFI
no longer breaks cross-module function address equality, so
crypto_shash_alg_has_setkey() can now be an inline function like before.

This commit should not be backported to kernels that don't have the new
CFI implementation.

Acked-by: Peter Zijlstra (Intel) <[email protected]>
Reviewed-by: Sami Tolvanen <[email protected]>
Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v6.1-rc5, v6.1-rc4, v6.1-rc3, v6.1-rc2, v6.1-rc1, v6.0, v6.0-rc7, v6.0-rc6, v6.0-rc5, v6.0-rc4, v6.0-rc3, v6.0-rc2, v6.0-rc1, v5.19, v5.19-rc8, v5.19-rc7, v5.19-rc6, v5.19-rc5, v5.19-rc4, v5.19-rc3, v5.19-rc2, v5.19-rc1, v5.18, v5.18-rc7, v5.18-rc6, v5.18-rc5, v5.18-rc4, v5.18-rc3, v5.18-rc2, v5.18-rc1, v5.17, v5.17-rc8, v5.17-rc7, v5.17-rc6, v5.17-rc5, v5.17-rc4, v5.17-rc3, v5.17-rc2, v5.17-rc1, v5.16, v5.16-rc8, v5.16-rc7, v5.16-rc6, v5.16-rc5, v5.16-rc4, v5.16-rc3, v5.16-rc2, v5.16-rc1, v5.15, v5.15-rc7, v5.15-rc6, v5.15-rc5, v5.15-rc4, v5.15-rc3, v5.15-rc2, v5.15-rc1, v5.14, v5.14-rc7, v5.14-rc6, v5.14-rc5, v5.14-rc4, v5.14-rc3, v5.14-rc2, v5.14-rc1, v5.13, v5.13-rc7, v5.13-rc6
# 22ca9f4a 10-Jun-2021 Ard Biesheuvel <[email protected]>

crypto: shash - avoid comparing pointers to exported functions under CFI

crypto_shash_alg_has_setkey() is implemented by testing whether the
.setkey() member of a struct shash_alg points to the default version,
called shash_no_setkey(). As crypto_shash_alg_has_setkey() is a static
inline, this requires shash_no_setkey() to be exported to modules.

Unfortunately, when building with CFI, function pointers are routed
via CFI stubs which are private to each module (or to the kernel proper)
and so this function pointer comparison may fail spuriously.

Let's fix this by turning crypto_shash_alg_has_setkey() into an out of
line function.

Cc: Sami Tolvanen <[email protected]>
Cc: Eric Biggers <[email protected]>
Signed-off-by: Ard Biesheuvel <[email protected]>
Reviewed-by: Eric Biggers <[email protected]>
Reviewed-by: Sami Tolvanen <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v5.13-rc5, v5.13-rc4, v5.13-rc3, v5.13-rc2, v5.13-rc1, v5.12, v5.12-rc8, v5.12-rc7, v5.12-rc6, v5.12-rc5, v5.12-rc4, v5.12-rc3, v5.12-rc2, v5.12-rc1, v5.12-rc1-dontuse, v5.11, v5.11-rc7, v5.11-rc6, v5.11-rc5, v5.11-rc4, v5.11-rc3, v5.11-rc2, v5.11-rc1, v5.10, v5.10-rc7, v5.10-rc6, v5.10-rc5, v5.10-rc4, v5.10-rc3, v5.10-rc2, v5.10-rc1, v5.9, v5.9-rc8, v5.9-rc7, v5.9-rc6, v5.9-rc5, v5.9-rc4, v5.9-rc3, v5.9-rc2
# b00ba76a 18-Aug-2020 Herbert Xu <[email protected]>

crypto: ahash - Add ahash_alg_instance

This patch adds the helper ahash_alg_instance which is used to
convert a crypto_ahash object into its corresponding ahash_instance.

Signed-off-by: Herbert Xu <[email protected]>
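
A short sketch of the helper in use: an instance's init_tfm can hop from
the transform to its instance and instance context. The example_* names
are hypothetical.

#include <crypto/internal/hash.h>

/* Hypothetical per-instance context stored behind the ahash instance. */
struct example_inst_ctx {
        struct crypto_ahash_spawn spawn;
};

static int example_init_tfm(struct crypto_ahash *tfm)
{
        struct ahash_instance *inst = ahash_alg_instance(tfm);
        struct example_inst_ctx *ictx = ahash_instance_ctx(inst);

        /* the instance context (e.g. its spawn) is now reachable here */
        (void)ictx;
        return 0;
}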


Revision tags: v5.9-rc1
# 8afa25aa 11-Aug-2020 Ira Weiny <[email protected]>

crypto: hash - Remove unused async iterators

Revert "crypto: hash - Add real ahash walk interface"
This reverts commit 75ecb231ff45b54afa9f4ec9137965c3c00868f4.

The callers of the functions in this commit were removed in ab8085c130ed.

Remove these unused calls.

Fixes: ab8085c130ed ("crypto: x86 - remove SHA multibuffer routines and mcryptd")
Cc: Ard Biesheuvel <[email protected]>
Signed-off-by: Ira Weiny <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v5.8, v5.8-rc7, v5.8-rc6, v5.8-rc5, v5.8-rc4, v5.8-rc3, v5.8-rc2, v5.8-rc1, v5.7, v5.7-rc7, v5.7-rc6, v5.7-rc5, v5.7-rc4, v5.7-rc3, v5.7-rc2, v5.7-rc1, v5.6, v5.6-rc7, v5.6-rc6, v5.6-rc5, v5.6-rc4, v5.6-rc3, v5.6-rc2, v5.6-rc1, v5.5, v5.5-rc7, v5.5-rc6, v5.5-rc5
# a39c66cc 03-Jan-2020 Eric Biggers <[email protected]>

crypto: shash - convert shash_free_instance() to new style

Convert shash_free_instance() and its users to the new way of freeing
instances, where a ->free() method is installed to the instance struct
itself. This replaces the weakly-typed method crypto_template::free().

This will allow removing support for the old way of freeing instances.

Also give shash_free_instance() a more descriptive name to reflect that
it's only for instances with a single spawn, not for any instance.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


# 48fb3e57 03-Jan-2020 Eric Biggers <[email protected]>

crypto: hash - add support for new way of freeing instances

Add support to shash and ahash for the new way of freeing instances
(already used for skcipher, aead, and akcipher) where a ->free() method
is installed to the instance struct itself. These methods are more
strongly-typed than crypto_template::free(), which they replace.

This will allow removing support for the old way of freeing instances.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>
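
On the shash side the new style looks roughly as below; only the ->free
member assignment and the register call are the point of the change, the
rest (names, single-spawn layout) is an illustrative sketch.

#include <crypto/internal/hash.h>
#include <linux/slab.h>

static void example_free_instance(struct shash_instance *inst)
{
        crypto_drop_shash(shash_instance_ctx(inst));
        kfree(inst);
}

static int example_create(struct crypto_template *tmpl, struct rtattr **tb)
{
        struct shash_instance *inst;
        int err;

        inst = kzalloc(sizeof(*inst) + sizeof(struct crypto_shash_spawn),
                       GFP_KERNEL);
        if (!inst)
                return -ENOMEM;

        /* ... grab the inner algorithm and fill in inst->alg ... */

        /* Strongly typed free method installed on the instance itself,
         * replacing the weakly typed crypto_template::free(). */
        inst->free = example_free_instance;

        err = shash_register_instance(tmpl, inst);
        if (err)
                example_free_instance(inst);
        return err;
}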


# 6d1b41fc 03-Jan-2020 Eric Biggers <[email protected]>

crypto: ahash - unexport crypto_ahash_type

Now that all the templates that need ahash spawns have been converted to
use crypto_grab_ahash() rather than look up the algorithm directly,
crypto_ahash_type is no longer used outside of ahash.c. Make it static.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


# 629f1afc 03-Jan-2020 Eric Biggers <[email protected]>

crypto: algapi - remove obsoleted instance creation helpers

Remove lots of helper functions that were previously used for
instantiating crypto templates, but are now unused:

- crypto_get_attr_alg() and similar functions looked up an inner
algorithm directly from a template parameter. These were replaced
with getting the algorithm's name, then calling crypto_grab_*().

- crypto_init_spawn2() and similar functions initialized a spawn, given
an algorithm. Similarly, these were replaced with crypto_grab_*().

- crypto_alloc_instance() and similar functions allocated an instance
with a single spawn, given the inner algorithm. These aren't useful
anymore since crypto_grab_*() need the instance allocated first.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


# 84a9c938 03-Jan-2020 Eric Biggers <[email protected]>

crypto: ahash - introduce crypto_grab_ahash()

Currently, ahash spawns are initialized by using ahash_attr_alg() or
crypto_find_alg() to look up the ahash algorithm, then calling
crypto_init_ahash_spawn().

This is different from how skcipher, aead, and akcipher spawns are
initialized (they use crypto_grab_*()), and for no good reason. This
difference introduces unnecessary complexity.

The crypto_grab_*() functions used to have some problems, like not
holding a reference to the algorithm and requiring the caller to
initialize spawn->base.inst. But those problems are fixed now.

So, let's introduce crypto_grab_ahash() so that we can convert all
templates to the same way of initializing their spawns.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>
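
A sketch of the new-style spawn setup in a template's ->create(); only
crypto_grab_ahash() and the existing instance helpers are the real API
here, while the example_* names and the elided registration steps are
illustrative.

#include <crypto/algapi.h>
#include <crypto/internal/hash.h>
#include <linux/err.h>
#include <linux/slab.h>

static int example_create(struct crypto_template *tmpl, struct rtattr **tb)
{
        struct ahash_instance *inst;
        struct crypto_ahash_spawn *spawn;
        const char *name;
        int err;

        name = crypto_attr_alg_name(tb[1]);
        if (IS_ERR(name))
                return PTR_ERR(name);

        inst = kzalloc(sizeof(*inst) + sizeof(*spawn), GFP_KERNEL);
        if (!inst)
                return -ENOMEM;
        spawn = ahash_instance_ctx(inst);

        /* Look up the inner ahash by name and take a reference on it. */
        err = crypto_grab_ahash(spawn, ahash_crypto_instance(inst),
                                name, 0, 0);
        if (err) {
                kfree(inst);
                return err;
        }

        /* ... fill in inst->alg from the grabbed algorithm, set
         * inst->free and register the instance ... */
        return 0;
}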


# fdfad1ff 03-Jan-2020 Eric Biggers <[email protected]>

crypto: shash - introduce crypto_grab_shash()

Currently, shash spawns are initialized by using shash_attr_alg() or
crypto_alg_mod_lookup() to look up the shash algorithm, then calling
crypto_init_shash_spawn().

This is different from how skcipher, aead, and akcipher spawns are
initialized (they use crypto_grab_*()), and for no good reason. This
difference introduces unnecessary complexity.

The crypto_grab_*() functions used to have some problems, like not
holding a reference to the algorithm and requiring the caller to
initialize spawn->base.inst. But those problems are fixed now.

So, let's introduce crypto_grab_shash() so that we can convert all
templates to the same way of initializing their spawns.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


# 77f7e94d 03-Jan-2020 Eric Biggers <[email protected]>

crypto: ahash - make struct ahash_instance be the full size

Define struct ahash_instance in a way analogous to struct
skcipher_instance, struct aead_instance, and struct akcipher_instance,
where the struct is defined to include both the algorithm structure at
the beginning and the additional crypto_instance fields at the end.

This is needed to allow allocating ahash instances directly using
kzalloc(sizeof(*inst) + sizeof(*ictx), ...) in the same way as skcipher,
aead, and akcipher instances. In turn, that's needed to make spawns be
initialized in a consistent way everywhere.

Also take advantage of the addition of the base instance to struct
ahash_instance by simplifying the ahash_crypto_instance() and
ahash_instance() functions.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


# 1b84e7d0 03-Jan-2020 Eric Biggers <[email protected]>

crypto: shash - make struct shash_instance be the full size

Define struct shash_instance in a way analogous to struct
skcipher_instance, struct aead_instance, and struct akcipher_instance,
where the struct is defined to include both the algorithm structure at
the beginning and the additional crypto_instance fields at the end.

This is needed to allow allocating shash instances directly using
kzalloc(sizeof(*inst) + sizeof(*ictx), ...) in the same way as skcipher,
aead, and akcipher instances. In turn, that's needed to make spawns be
initialized in a consistent way everywhere.

Also take advantage of the addition of the base instance to struct
shash_instance by simplifying the shash_crypto_instance() and
shash_instance() functions.

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>


Revision tags: v5.5-rc4, v5.5-rc3
# c6d633a9 15-Dec-2019 Eric Biggers <[email protected]>

crypto: algapi - make unregistration functions return void

Some of the algorithm unregistration functions return -ENOENT when asked
to unregister a non-registered algorithm, while others always return 0
or always return void. But no users check the return value, except for
two of the bulk unregistration functions which print a message on error
but still always return 0 to their caller, and crypto_del_alg() which
calls crypto_unregister_instance() which always returns 0.

Since unregistering a non-registered algorithm is always a kernel bug
but there isn't anything callers should do to handle this situation at
runtime, let's simplify things by making all the unregistration
functions return void, and moving the error message into
crypto_unregister_alg() and upgrading it to a WARN().

Signed-off-by: Eric Biggers <[email protected]>
Signed-off-by: Herbert Xu <[email protected]>
