I understand tokenizing. I can get the lexer to read the file, split it into tokens, and display them in the console. What is the process after this? How do I retrieve the tokenized information, pass it to the parser, and connect it with the tree nodes? The code I have prints the whole buffer that was read, then breaks it into tokens and labels each one accordingly.
1) At this point, do I pull the defines from the preprocessor and lex again? Would those live in the lex.h file, or in a separate def file like the Dr. Dobb's example?
2) Do I include token.h and treenode.h and start trying to match tokens to tree nodes?
3) Do I build the symbol table at this point?
There has to be some kind soul who will help me connect the dots. Lol.
What I have tried:
.