Apply suggestions from code review
Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
BenjaminBossan and younesbelkada authored Dec 4, 2023
1 parent 6855fe4 commit 6d9a037
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion src/peft/peft_model.py
@@ -176,7 +176,7 @@ def save_pretrained(
                 exist).
             safe_serialization (`bool`, *optional*):
                 Whether to save the adapter files in safetensors format, defaults to `True`.
-            selected_adapters (`list` of `str`, *optional*):
+            selected_adapters (`List[str]`, *optional*):
                 A list of adapters to be saved. If `None`, will default to all adapters.
             save_embedding_layers (`Union[bool, str]`, *optional*, defaults to `"auto"`):
                 If `True`, save the embedding layers in addition to adapter weights. If `auto`, checks the common
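For orientation, here is how the arguments documented in this hunk fit together. This is a minimal usage sketch, not part of the commit: the base model, the adapter names ("default", "task_b") and the output path are chosen purely for illustration.

# Minimal sketch of the save_pretrained() arguments documented above.
# Model choice, adapter names and output path are illustrative only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")
model = get_peft_model(base, LoraConfig(task_type="CAUSAL_LM"), adapter_name="default")
model.add_adapter("task_b", LoraConfig(task_type="CAUSAL_LM"))

# selected_adapters is a List[str]; only the named adapters are written out.
# safe_serialization=True (the default) stores the weights as safetensors.
model.save_pretrained(
    "outputs/my-adapters",
    selected_adapters=["task_b"],
    safe_serialization=True,
)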
2 changes: 1 addition & 1 deletion src/peft/tuners/tuners_utils.py
@@ -403,7 +403,7 @@ def set_adapter(self, adapter_names: str | list[str]) -> None:
         """Set the active adapter(s).

         Args:
-            adapter_name (`str` or `list[str]`): Name of the adapter(s) to be activated.
+            adapter_name (`str` or `List[str]`): Name of the adapter(s) to be activated.
         """
         if isinstance(adapter_names, str):
             adapter_names = [adapter_names]
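A companion sketch for the set_adapter() behaviour documented in this hunk: a single name or a list of names can be passed, since a bare string is wrapped into a list before use. It continues the hypothetical model from the previous example; reaching the method via model.base_model is one possible call path and is an assumption, not something this commit prescribes.

# Hypothetical continuation of the sketch above; `model` holds the adapters
# "default" and "task_b".

# A single adapter name (str) is accepted ...
model.base_model.set_adapter("default")

# ... and so is a list of names (List[str]), activating several adapters at once.
model.base_model.set_adapter(["default", "task_b"])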
