Module RDoc::TokenStream
In: tokenstream.rb

A TokenStream is a list of tokens, gathered during the parse of some entity (say a method). Entities populate these streams by being registered with the lexer. Any class can collect tokens by including TokenStream. From the outside, you use such an object by calling the start_collecting_tokens method, followed by calls to add_token and pop_token.
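
The sketch below illustrates this protocol. Only the RDoc::TokenStream methods shown in the source listings (start_collecting_tokens, add_token, add_tokens, pop_token and token_stream) come from this module; the MethodEntity class, the Struct-based Token stand-in and the require path are illustrative assumptions.

    # Minimal usage sketch. The require path and the Token struct are
    # assumptions; real callers receive token objects from RDoc's lexer.
    require 'rdoc/tokenstream'

    Token = Struct.new(:name, :text)   # hypothetical stand-in for lexer tokens

    class MethodEntity                 # hypothetical entity, e.g. a parsed method
      include RDoc::TokenStream
    end

    entity = MethodEntity.new
    entity.start_collecting_tokens                # initialise the internal stream
    entity.add_token(Token.new(:kw, "def"))       # append one token
    entity.add_tokens([Token.new(:space, " "),
                       Token.new(:ident, "example")])
    entity.pop_token                              # drop the most recently added token
    p entity.token_stream.map(&:text)             # => ["def", " "]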

Methods

add_token   add_tokens   pop_token   start_collecting_tokens   token_stream

Public Instance methods

add_token(tk)

[Source]

    # File tokenstream.rb, line 20
    def add_token(tk)
      @token_stream << tk
    end

add_tokens(tks)

[Source]

    # File tokenstream.rb, line 24
    def add_tokens(tks)
      tks.each {|tk| add_token(tk)}
    end

pop_token()

[Source]

    # File tokenstream.rb, line 28
    def pop_token
      @token_stream.pop
    end

start_collecting_tokens()

[Source]

    # File tokenstream.rb, line 16
    def start_collecting_tokens
      @token_stream = []
    end

token_stream()

[Source]

    # File tokenstream.rb, line 12
    def token_stream
      @token_stream
    end
