TokenStream in proc_macro - Rust
Struct TokenStream (stable since Rust 1.15.0)
pub struct TokenStream(/* private fields */);
The main type provided by this crate, representing an abstract stream of tokens, or, more specifically, a sequence of token trees. The type provides interfaces for iterating over those token trees and, conversely, collecting a number of token trees into one stream.
This is both the input and output of #[proc_macro], #[proc_macro_attribute], and #[proc_macro_derive] definitions.
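Below is a minimal sketch of how a TokenStream flows through a procedural macro. The function-like macro name echo is made up for illustration, and the code assumes a crate compiled with proc-macro = true.

use proc_macro::TokenStream;

// A function-like proc macro: it receives the tokens written inside the
// macro invocation and returns the tokens that replace the invocation.
#[proc_macro]
pub fn echo(input: TokenStream) -> TokenStream {
    // Return the input unchanged; a real macro would inspect or rewrite it.
    input
}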
pub fn new() -> TokenStream (stable since 1.29.0)
Returns an empty TokenStream containing no token trees.
pub fn is_empty(&self) -> bool (stable since 1.29.0)
Checks if this TokenStream is empty.
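A short sketch using new and is_empty together. The macro name forward is hypothetical, and the code again assumes a proc-macro crate.

use proc_macro::TokenStream;

#[proc_macro]
pub fn forward(input: TokenStream) -> TokenStream {
    // A freshly created stream holds no token trees.
    let mut out = TokenStream::new();
    assert!(out.is_empty());
    // Append the caller's tokens; `out` stays empty only if `input` was empty.
    out.extend(input);
    out
}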
pub fn expand_expr(&self) -> Result<TokenStream, ExpandError>
🔬 This is a nightly-only experimental API (proc_macro_expand, tracking issue #90765).
Parses this TokenStream as an expression and attempts to expand any macros within it. Returns the expanded TokenStream.
Currently only expressions expanding to literals will succeed, although this may be relaxed in the future.
NOTE: In error conditions, expand_expr may leave macros unexpanded, report an error and fail compilation, and/or return an Err(..). The specific behavior for any error condition, and what conditions are considered errors, is unspecified and may change in the future.
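A hedged sketch of calling expand_expr on nightly. It assumes #![feature(proc_macro_expand)] at the crate root and the Result-returning signature shown above, both of which may change while the feature is unstable; the macro name as_literal is made up.

use proc_macro::TokenStream;

#[proc_macro]
pub fn as_literal(input: TokenStream) -> TokenStream {
    // Try to expand the argument (e.g. `concat!("a", "b")`) down to a literal.
    match input.expand_expr() {
        Ok(expanded) => expanded,
        // On failure, macros may be left unexpanded; fall back to the original
        // tokens rather than relying on any particular error behavior.
        Err(_) => input,
    }
}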
impl Debug for TokenStream
Prints tokens in a form convenient for debugging.
impl Display for TokenStream
Prints the token stream as a string that is supposed to be losslessly convertible back into the same token stream (modulo spans), except for possibly TokenTree::Groups with Delimiter::None delimiters and negative numeric literals.
Note: the exact form of the output is subject to change, e.g. there might be changes in the whitespace used between tokens. Therefore, you should not do any kind of simple substring matching on the output string (as produced by to_string) to implement a proc macro, because that matching might stop working if such changes happen. Instead, you should work at the TokenTree level, e.g. matching against TokenTree::Ident, TokenTree::Punct, or TokenTree::Literal.
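As suggested above, structural matching on token trees is more robust than substring matching on to_string(). A small sketch, with the task (counting identifiers named foo, recursing into groups) made up for illustration:

use proc_macro::{TokenStream, TokenTree};

fn count_foo(stream: TokenStream) -> usize {
    let mut count = 0;
    for tree in stream {
        match tree {
            // Match identifiers structurally instead of scanning the
            // stringified stream for the substring "foo".
            TokenTree::Ident(ident) if ident.to_string() == "foo" => count += 1,
            // Groups carry a nested stream for the tokens between their
            // delimiters; walk it as well.
            TokenTree::Group(group) => count += count_foo(group.stream()),
            _ => {}
        }
    }
    count
}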
impl Extend&lt;TokenStream&gt; for TokenStream
extend: Extends a collection with the contents of an iterator.
extend_one: Extends a collection with exactly one element. 🔬 Nightly-only experimental API (extend_one, #72631).
extend_reserve: Reserves capacity in a collection for the given number of additional elements. 🔬 Nightly-only experimental API (extend_one, #72631).
impl Extend&lt;TokenTree&gt; for TokenStream
extend: Extends a collection with the contents of an iterator.
extend_one: Extends a collection with exactly one element. 🔬 Nightly-only experimental API (extend_one, #72631).
extend_reserve: Reserves capacity in a collection for the given number of additional elements. 🔬 Nightly-only experimental API (extend_one, #72631).
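A sketch of both Extend impls in use. The helper name comma_triple is made up, and the code assumes it runs while a procedural macro is executing, since the proc_macro API is only usable in that context.

use proc_macro::{Ident, Punct, Spacing, Span, TokenStream, TokenTree};

fn comma_triple() -> TokenStream {
    let mut ts = TokenStream::new();
    // Extend<TokenTree>: push individual token trees, producing `a , b`.
    ts.extend([
        TokenTree::Ident(Ident::new("a", Span::call_site())),
        TokenTree::Punct(Punct::new(',', Spacing::Alone)),
        TokenTree::Ident(Ident::new("b", Span::call_site())),
    ]);
    // Extend<TokenStream>: append the contents of whole streams.
    ts.extend([", c".parse::<TokenStream>().unwrap()]);
    ts
}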
impl From&lt;TokenTree&gt; for TokenStream
Creates a token stream containing a single token tree.
from: Converts to this type from the input type.
impl FromIterator&lt;TokenStream&gt; for TokenStream
A “flattening” operation on token streams, which collects token trees from multiple token streams into a single stream.
impl FromIterator&lt;TokenTree&gt; for TokenStream
Collects a number of token trees into a single stream.
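Both FromIterator impls show up naturally with collect(). This sketch (the function name collect_demo is made up, and it assumes an executing procedural macro) builds one stream by flattening parsed fragments and another from individual token trees.

use proc_macro::{Literal, TokenStream, TokenTree};

fn collect_demo() -> (TokenStream, TokenStream) {
    // FromIterator<TokenStream>: flatten several streams into one.
    let flattened: TokenStream = ["1 +", "2 +", "3"]
        .iter()
        .map(|s| s.parse::<TokenStream>().unwrap())
        .collect();
    // FromIterator<TokenTree>: collect individual trees into one stream.
    let from_trees: TokenStream = (1u32..=3)
        .map(|n| TokenTree::Literal(Literal::u32_unsuffixed(n)))
        .collect();
    (flattened, from_trees)
}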
impl FromStr for TokenStream
Attempts to break the string into tokens and parse those tokens into a token stream. May fail for a number of reasons, for example, if the string contains unbalanced delimiters or characters not existing in the language. All tokens in the parsed stream get Span::call_site() spans.
NOTE: some errors may cause panics instead of returning LexError. We reserve the right to change these errors into LexErrors later.
type Err = LexError: The associated error which can be returned from parsing.
from_str: Parses a string s to return a value of this type.
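A short sketch of FromStr in practice via str::parse. The helper name parse_demo is made up, and the code assumes it runs inside an executing procedural macro, where all parsed tokens receive Span::call_site() spans.

use proc_macro::TokenStream;

fn parse_demo() {
    // Well-formed Rust tokens parse into a non-empty stream.
    let ok: TokenStream = "fn answer() -> u32 { 42 }".parse().unwrap();
    assert!(!ok.is_empty());
    // Unbalanced delimiters are rejected with a LexError.
    assert!("fn broken( {".parse::<TokenStream>().is_err());
}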
impl IntoIterator for TokenStream
type Item = TokenTree: The type of the elements being iterated over.
type IntoIter = token_stream::IntoIter: Which kind of iterator are we turning this into?
into_iter: Creates an iterator from a value.
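Since iteration yields TokenTree values, a stream can be filtered or transformed with ordinary iterator adapters. The helper name ident_names is made up for illustration.

use proc_macro::{TokenStream, TokenTree};

fn ident_names(stream: TokenStream) -> Vec<String> {
    stream
        .into_iter()
        .filter_map(|tree| match tree {
            // Keep only identifiers, rendered as strings.
            TokenTree::Ident(ident) => Some(ident.to_string()),
            _ => None,
        })
        .collect()
}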
impl ToTokens for TokenStream
🔬 This is a nightly-only experimental API (proc_macro_totokens, #130977).
to_tokens: Write self to the given TokenStream.
to_token_stream: Convert self directly into a TokenStream object.
into_token_stream: Convert self directly into a TokenStream object.
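A hedged sketch of the nightly ToTokens methods. It assumes #![feature(proc_macro_totokens)] at the crate root and that TokenTree implements ToTokens under that feature; both the trait's availability and its impl coverage may change while #130977 is open.

use proc_macro::{Ident, Span, ToTokens, TokenStream, TokenTree};

fn append_ident(out: &mut TokenStream) {
    let tree = TokenTree::Ident(Ident::new("value", Span::call_site()));
    // Write the token into an existing stream in place...
    tree.to_tokens(out);
    // ...or turn a value into a stream of its own.
    let _standalone: TokenStream = tree.into_token_stream();
}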