# glsl-parser

transform streamed glsl tokens into an ast
npm install glsl-parser

```javascript
var TokenStream = require('glsl-tokenizer/stream')
var ParseStream = require('glsl-parser/stream')
var fs = require('fs')
fs.createReadStream('test.glsl')
  .pipe(TokenStream())
  .pipe(ParseStream())
  .on('data', function(x) {
    console.log('ast of', x.type)
  })
```
### parseStream.program
The full program's AST, which will be updated with each incoming token.
### require('glsl-parser/direct')(tokens)
Synchronously parses an array of tokens from glsl-tokenizer.
```javascript
var TokenString = require('glsl-tokenizer/string')
var ParseTokens = require('glsl-parser/direct')
var fs = require('fs')
var src = fs.readFileSync('test.glsl', 'utf8')
var tokens = TokenString(src)
var ast = ParseTokens(tokens)
console.log(ast)
```
## Nodes
* stmtlist
* stmt
* struct
* function
* functionargs
* decl
* decllist
* forloop
* whileloop
* if
* expr
* precision
* comment
* preprocessor
* keyword
* ident
* return
* continue
* break
* discard
* do-while
* binary
* ternary
* unary
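
Every emitted node carries one of the types above. Assuming each node exposes `type` and `children` (an assumed shape -- the tree below is a hand-built stand-in, not a real parse result), a small depth-first walk is enough to tally node types:

```javascript
// walk a tree depth-first, counting how often each node type appears.
// the { type, children } shape is an assumption for this sketch.
function countTypes(node, counts) {
  counts = counts || {}
  counts[node.type] = (counts[node.type] || 0) + 1
  ;(node.children || []).forEach(function(child) {
    countTypes(child, counts)
  })
  return counts
}

// hand-built stand-in resembling a parse of `float x;`
var ast = {
  type: 'stmtlist',
  children: [
    { type: 'stmt', children: [
      { type: 'decl', children: [
        { type: 'keyword', children: [] },
        { type: 'decllist', children: [
          { type: 'ident', children: [] }
        ] }
      ] }
    ] }
  ]
}

console.log(countTypes(ast)) // every node type in the toy tree appears once
```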
## Known Issues
* because i am not smart enough to write a fully streaming parser, the current parser "cheats" a bit when it encounters an `expr` node! it actually waits until it has all of the tokens it needs to build a tree for a given expression, then builds it and emits the constituent child nodes in the expected order. the expr parsing is heavily influenced by crockford's tdop article. the rest of the parser is heavily influenced by fever dreams.
* the parser might hit a state where it's looking at what *could be* an expression, or could be a declaration -- that is, the statement starts with a previously declared struct. it'll opt to pretend it's a declaration, but that might not be the case -- it might be a user-defined constructor starting a statement!
* "unhygienic" `#if` / `#endif` macros are completely unhandled at the moment, since they're a bit of a pain.
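
to make the declaration-vs-constructor ambiguity above concrete, here's a hypothetical glsl snippet where two statements begin with the same struct name:

```glsl
struct Foo { float x; };

void main() {
  Foo f;    // starts with "Foo": a declaration
  Foo(1.0); // also starts with "Foo": a constructor-call expression statement
}
```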
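
the tdop ("top down operator precedence") approach borrowed for expr parsing boils down to a binding-power loop. here's a toy sketch of the technique over pre-split tokens -- all names are hypothetical, and this is not glsl-parser's actual internals:

```javascript
// toy tdop/pratt expression parser: number literals and + - * /,
// with precedence decided by each operator's binding power.
var BINDING = { '+': 10, '-': 10, '*': 20, '/': 20 }

function parseExpr(tokens, rbp) {
  var left = { type: 'literal', value: tokens.shift() }
  // keep folding in operators whose binding power beats the current one
  while (tokens.length && (BINDING[tokens[0]] || 0) > rbp) {
    var op = tokens.shift()
    left = {
      type: 'binary',
      op: op,
      left: left,
      right: parseExpr(tokens, BINDING[op])
    }
  }
  return left
}

var tree = parseExpr(['1', '+', '2', '*', '3'], 0)
console.log(tree.op, tree.right.op) // '+' '*': multiplication binds tighter
```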