Function calls are the easiest to work with, since they require no parser changes. A function call is represented as a `*parser.CallExpr`, and specific translations are registered in `initKnownFunctions` in pql.go.

- Add a new entry to the function table in `initKnownFunctions`.
- Create a callback function that writes the SQL you want. Callbacks can inspect the full AST of their function call and can use `writeExpressionMaybeParen` to translate their arguments.
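The steps above follow a table-driven pattern: a map from function name to a callback that emits SQL. Here is a minimal, self-contained sketch of that shape. The names `functionRewrite`, `knownFunctions`, and `translateCall` are illustrative stand-ins, and the callback here receives already-translated argument strings rather than the AST nodes the real callbacks inspect.

```go
package main

import (
	"fmt"
	"strings"
)

// functionRewrite is an illustrative stand-in for the callback type
// registered in initKnownFunctions: it writes the output SQL for a call.
type functionRewrite func(sb *strings.Builder, args []string)

// knownFunctions maps a PQL function name to its SQL translation.
var knownFunctions = map[string]functionRewrite{
	// Hypothetical entry: translate strcat(a, b, ...) into concat(a, b, ...).
	"strcat": func(sb *strings.Builder, args []string) {
		sb.WriteString("concat(")
		sb.WriteString(strings.Join(args, ", "))
		sb.WriteString(")")
	},
}

// translateCall looks up the function and, if known, runs its callback.
func translateCall(name string, args []string) (string, bool) {
	rewrite, ok := knownFunctions[name]
	if !ok {
		return "", false
	}
	sb := new(strings.Builder)
	rewrite(sb, args)
	return sb.String(), true
}

func main() {
	sql, _ := translateCall("strcat", []string{"foo", "'x'"})
	fmt.Println(sql) // concat(foo, 'x')
}
```

The table lookup keeps each translation isolated: adding a function never touches the dispatch logic, only the map.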
Tabular operators are more involved than scalar functions because each operator's syntax is distinct. PQL uses a recursive descent parser, so parsing rules are written as plain Go code that consumes tokens.
- Create a new struct for your operator in ast.go that implements `TabularOperator`. Look to other `TabularOperator` types in ast.go for inspiration.
- Add another case to the `switch` inside the `Walk` function to support your new operator.
- Add another case to the `switch` inside the `*parser.tabularExpr` method.
- Add a parsing method to `*parser` that converts tokens into your new `TabularOperator` type. You can look at `*parser.whereOperator` and `*parser.takeOperator` as basic examples.
- Add a test to ensure that your tabular operator is parsed as you expect.
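Because the parser is recursive descent, an operator's parsing method is plain Go that checks and consumes tokens one at a time. The sketch below shows that shape for a `take`-style operator; the `token` and `TakeOperator` types are simplified stand-ins for the real ones in lex.go and ast.go, which also carry source spans.

```go
package main

import (
	"fmt"
	"strconv"
)

// token is a simplified stand-in for the lexer's token type.
type token struct {
	kind  string // e.g. "ident", "number"
	value string
}

// TakeOperator is a simplified operator struct; the real version
// would implement the TabularOperator interface and record spans.
type TakeOperator struct {
	Count int
}

// parseTakeOperator consumes tokens the way a *parser method would:
// verify the keyword, then the argument, returning an error on bad input.
func parseTakeOperator(tokens []token) (*TakeOperator, error) {
	if len(tokens) < 2 || tokens[0].value != "take" {
		return nil, fmt.Errorf("expected 'take'")
	}
	n, err := strconv.Atoi(tokens[1].value)
	if err != nil {
		return nil, fmt.Errorf("take: expected row count: %v", err)
	}
	return &TakeOperator{Count: n}, nil
}

func main() {
	op, err := parseTakeOperator([]token{{"ident", "take"}, {"number", "5"}})
	if err != nil {
		panic(err)
	}
	fmt.Println(op.Count) // 5
}
```

Returning an error (rather than panicking) on unexpected tokens mirrors how recursive descent parsers surface syntax errors to the caller.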
Now for compilation:
- Add a new case to `*subquery.write` in pql.go to transform your parsed tabular operator struct into a `SELECT` statement.
- If necessary, add logic to the `canAttachSort` function to signal the behavior of the operator to the subquery split algorithm.
- Add an end-to-end test to verify that your operator compiles as expected.
See the commit that introduced the `as` operator for an example.
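To make the compilation step concrete, here is a hedged sketch of what a new case in `*subquery.write` might produce for the `take` operator from the parsing example: the operator's fields become a clause on the generated `SELECT`. The `writeTake` function and its `source` parameter are illustrative; the real method builds on the previous subquery and handles quoting itself.

```go
package main

import (
	"fmt"
	"strings"
)

// writeTake sketches one case of the operator-to-SELECT transformation:
// the parsed operator's count becomes a LIMIT clause on the statement.
func writeTake(source string, count int) string {
	sb := new(strings.Builder)
	sb.WriteString("SELECT * FROM ")
	sb.WriteString(source)
	fmt.Fprintf(sb, " LIMIT %d", count)
	return sb.String()
}

func main() {
	fmt.Println(writeTake(`"events"`, 5)) // SELECT * FROM "events" LIMIT 5
}
```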
If you're adding a new syntactical element, you should first add it to the lexer.
- Write a test that uses your new token. It will fail to start. Defects in lexing can lead to surprising problems during parsing, so it's important to always test lexing independently.
- Add the new token type to the list in lex.go.
- Run `go generate` inside the `parser` directory.
- Modify the `Scan` function to detect your token. As a special case, if you are adding a new keyword, you can add it to the `keywords` map variable.
- Re-run your test to ensure it passes.
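The keyword special case works because the scanner can classify an identifier after reading it: if the spelling appears in the `keywords` map, it is emitted with a dedicated token kind instead of as a plain identifier. A minimal sketch of that lookup, with illustrative token-kind names that do not match the real constants in lex.go:

```go
package main

import "fmt"

// keywords mimics the map consulted by Scan. The kind names are
// placeholders, not the actual token constants.
var keywords = map[string]string{
	"where": "TokenWhere",
	"take":  "TokenTake",
	"as":    "TokenAs", // a hypothetical newly added keyword
}

// classify returns the token kind for a scanned identifier.
func classify(ident string) string {
	if kind, ok := keywords[ident]; ok {
		return kind
	}
	return "TokenIdentifier"
}

func main() {
	fmt.Println(classify("as"))   // TokenAs
	fmt.Println(classify("span")) // TokenIdentifier
}
```

This is why adding a keyword needs no new scanning logic: the identifier path already exists, and the map entry only changes how the result is labeled.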
You will then need to modify the parser to handle this new type of token. The exact set of changes varies depending on how the token is used, but the process will be similar to adding a new tabular operator. See the commit that introduced indexing expressions for an example.
Add an example to the golden tests in the testdata directory that demonstrates the usage of the new feature.