
Searched refs:inference (Results 1 – 7 of 7) sorted by relevance

/linux-6.15/net/ipv4/tcp_lp.c
    87:  u32 inference;                                            (member)
    110: lp->inference = 0;                                        (in tcp_lp_init())
    284: lp->inference = 3 * delta;                                (in tcp_lp_pkts_acked())
    287: if (lp->last_drop && (now - lp->last_drop < lp->inference))  (in tcp_lp_pkts_acked())
/linux-6.15/drivers/accel/qaic/Kconfig
    15: designed to accelerate Deep Learning inference workloads.
/linux-6.15/drivers/accel/habanalabs/Kconfig
    18: designed to accelerate Deep Learning inference and training workloads.
/linux-6.15/drivers/accel/ivpu/Kconfig
    15: is a CPU-integrated inference accelerator for Computer Vision
/linux-6.15/Documentation/accel/introduction.rst
    19: - Edge AI - doing inference at an edge device. It can be an embedded ASIC/FPGA,
/linux-6.15/Documentation/accel/amdxdna/amdnpu.rst
    15: AMD NPU (Neural Processing Unit) is a multi-user AI inference accelerator
/linux-6.15/Documentation/accel/qaic/aic100.rst
    13: inference workloads. They are AI accelerators.