27 changes: 26 additions & 1 deletion docs/source/layers.md
@@ -348,7 +348,7 @@ a kernel to a range of ROCm capabilities.

### Loading from a local repository for testing

-The `LocalLayerRepository` class is provided to load a repository from
+The `LocalLayerRepository` class is provided to load a layer repository from
Collaborator: we can have repositories that contain layers and functions at the same time, so there is no such thing as a layer repository or a function repository

a local directory. For example:

```python
@@ -366,3 +366,28 @@ with use_kernel_mapping(
):
kernelize(linear, mode=Mode.INFERENCE)
```
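
The example above is partially collapsed in this diff. For context, here is a minimal sketch of what a complete `LocalLayerRepository` mapping looks like, assuming it takes `repo_path`, `package_name`, and `layer_name` parameters analogous to the `LocalFuncRepository` example added below (the exact example in layers.md is not shown here):

```python
from kernels import LocalLayerRepository, Mode, kernelize, use_kernel_mapping

with use_kernel_mapping(
    {
        "SiluAndMul": {
            "cuda": LocalLayerRepository(
                # Directory containing the local kernel repository checkout.
                repo_path="/home/daniel/kernels/activation",
                # Python package name of the kernel inside that directory.
                package_name="activation",
                # Layer class to load (assumed; mirrors `func_name` below).
                layer_name="SiluAndMul",
            )
        }
    },
    # Use only this mapping inside the context instead of extending
    # the globally registered mapping.
    inherit_mapping=False,
):
    # `linear` is the example module defined earlier in layers.md (collapsed here).
    kernelize(linear, mode=Mode.INFERENCE)
```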

Similarly, `LocalFuncRepository` can be used to load a function repository from
a local directory for testing. This is useful when developing or testing kernel
functions locally before publishing them to the Hub:

```python
with use_kernel_mapping(
{
"silu_and_mul": {
"cuda": LocalFuncRepository(
repo_path="/home/daniel/kernels/activation",
package_name="activation",
func_name="silu_and_mul",
)
}
},
inherit_mapping=False,
):
kernelize(model, mode=Mode.INFERENCE)
```

The parameters are:
- `repo_path`: Path to the local kernel repository directory
- `package_name`: The Python package name of the kernel
- `func_name`: The name of the function within the kernel repository
Comment on lines +389 to +393
Collaborator: maybe we don't need to add this, it's already very explicit within the example, wdyt?

22 changes: 22 additions & 0 deletions docs/source/locking.md
@@ -44,6 +44,28 @@ kernel_layer_mapping = {
register_kernel_mapping(kernel_layer_mapping)
```
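
The collapsed context above is the existing locked-layer example in locking.md. As a reference point for the function variant added below, a rough sketch of such a mapping, assuming the `LockedLayerRepository` class with `repo_id` and `layer_name` parameters (the actual example is not visible in this diff):

```python
from kernels import LockedLayerRepository, register_kernel_mapping

# Hypothetical reconstruction of the collapsed locked-layer mapping;
# kernel versions are resolved from the project's kernel lock file.
kernel_layer_mapping = {
    "SiluAndMul": {
        "cuda": LockedLayerRepository(
            repo_id="kernels-community/activation",
            layer_name="SiluAndMul",
        )
    }
}

register_kernel_mapping(kernel_layer_mapping)
```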

### Locked kernel functions

Locked functions can also be used with `LockedFuncRepository`. This is useful for stateless
layers or functions that were converted with `use_kernel_func_from_hub`:
Comment on lines +49 to +50
Collaborator: not sure this is accurate since all layers are stateless, maybe we can just say:

Suggested change
-Locked functions can also be used with `LockedFuncRepository`. This is useful for stateless
-layers or functions that were converted with `use_kernel_func_from_hub`:
+Locking is also supported for functions, just like for layers, through the use of `LockedFuncRepository`:


```python
from kernels import LockedFuncRepository, Mode, kernelize, use_kernel_mapping

kernel_func_mapping = {
"silu_and_mul": {
"cuda": LockedFuncRepository(
repo_id="kernels-community/activation",
func_name="silu_and_mul",
)
}
}

with use_kernel_mapping(kernel_func_mapping):
# Your model code here
model = kernelize(model, device="cuda", mode=Mode.INFERENCE)
```

## Pre-downloading locked kernels

Locked kernels can be pre-downloaded by running `kernels download .` in your