Insert empty line between suite and alternative branch after def/class by konstin · Pull Request #12294 · astral-sh/ruff

ruff-ecosystem results

Formatter (stable)

ℹ️ ecosystem check encountered format errors. (no format changes; 1 project error)

openai/openai-cookbook (error)

warning: Detected debug build without --no-cache.
error: Failed to parse examples/chatgpt/gpt_actions_library/.gpt_action_getting_started.ipynb:11:1:1: Expected an expression
error: Failed to parse examples/chatgpt/gpt_actions_library/gpt_action_bigquery.ipynb:13:1:1: Expected an expression

Formatter (preview)

ℹ️ ecosystem check detected format changes. (+15 -0 lines in 14 files in 5 projects; 1 project error; 48 projects unchanged)
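
For context, the change under test inserts an empty line between the body of a def/class that ends a suite and a following alternative branch (else, elif, except, finally). A minimal illustration of the intended preview behavior, using a hypothetical snippet rather than code from the projects below:

# Stable style: the else branch directly follows the nested function body.
if debug:
    def log(msg: str) -> None:
        print(msg)
else:
    def log(msg: str) -> None:
        pass

# Preview style with this change: an empty line is inserted after the def suite,
# before the alternative branch.
if debug:
    def log(msg: str) -> None:
        print(msg)

else:
    def log(msg: str) -> None:
        pass

The added lines reported in the diffs below appear to be empty lines of this kind, inserted after the def/class suites shown.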

bokeh/bokeh (+5 -0 lines across 5 files)

ruff format --preview

setup.py~L42

 def yellow(text: str) -> str:
     return f"{colorama.Fore.YELLOW}{text}{colorama.Style.RESET_ALL}"

src/bokeh/command/subcommands/file_output.py~L193

             def indexed(i: int) -> str:
                 return filename

src/bokeh/core/has_props.py~L49

 F = TypeVar("F", bound=Callable[..., Any])

 def lru_cache(arg: int | None) -> Callable[[F], F]: ...

src/bokeh/document/locking.py~L93

     @wraps(func)
     async def _wrapper(*args: Any, **kw: Any) -> None:
         await func(*args, **kw)

tests/unit/bokeh/core/test_has_props.py~L556

     class DupeProps(hp.HasProps):
         bar = AngleSpec()
         bar_units = String()

langchain-ai/langchain (+2 -0 lines across 2 files)

ruff format --preview

libs/community/tests/unit_tests/chat_message_histories/test_sql.py~L10

 class Base(DeclarativeBase):
     pass

libs/core/langchain_core/messages/utils.py~L765

     def list_token_counter(messages: Sequence[BaseMessage]) -> int:
         return sum(token_counter(msg) for msg in messages)  # type: ignore[arg-type, misc]

python/typeshed (+4 -0 lines across 3 files)

ruff format --preview

stdlib/dbm/gnu.pyi~L44

 if sys.version_info >= (3, 11):
     def open(filename: StrOrBytesPath, flags: str = "r", mode: int = 0o666, /) -> _gdbm: ...

stdlib/inspect.pyi~L334

         locals: Mapping[str, Any] | None = None,
         eval_str: bool = False,
     ) -> Self: ...

stdlib/ipaddress.pyi~L46

     def __ge__(self, other: Self) -> bool: ...
     def __gt__(self, other: Self) -> bool: ...
     def __le__(self, other: Self) -> bool: ...

stdlib/ipaddress.pyi~L84

     def __ge__(self, other: Self) -> bool: ...
     def __gt__(self, other: Self) -> bool: ...
     def __le__(self, other: Self) -> bool: ...

indico/indico (+3 -0 lines across 3 files)

ruff format --preview

indico/modules/events/contributions/util.py~L110

         if not c.speakers:
             return True, None
         return False, speakers[0].get_full_name(last_name_upper=False, abbrev_first_name=False).lower()

indico/web/flask/templating.py~L63

         if isinstance(item, str):
             item = item.lower()
         return natural_sort_key(item)

indico/web/flask/util.py~L78

         # Indico RH
         def wrapper(**kwargs):
             return obj().process()

mesonbuild/meson-python (+1 -0 lines across 1 file)

ruff format --preview

mesonpy/_compat.py~L29

 def read_binary(package: str, resource: str) -> bytes:
     return importlib.resources.files(package).joinpath(resource).read_bytes()

openai/openai-cookbook (error)

ruff format --preview

warning: Detected debug build without --no-cache.
error: Failed to parse examples/chatgpt/gpt_actions_library/.gpt_action_getting_started.ipynb:11:1:1: Expected an expression
error: Failed to parse examples/chatgpt/gpt_actions_library/gpt_action_bigquery.ipynb:13:1:1: Expected an expression