I understand the tokenizing part. I have the lexer reading the file, splitting it into tokens, labeling them accordingly, and displaying them in the console. What is the process after this?

1) At this point, do I run the preprocessor to pull in the #defines and then lex again? Would those go in the lex.h file, or in a separate .def file like Dr. Dobb's uses?

2) Do I build the symbol table at this point?

3) How do I pass the lexed input to the parser, get the tree nodes, and start matching? And how does the linking part fit into all of this?

There has to be some kind soul that will help me connect the dots. Lol.

What I have tried:

I have tried a few compiler books, but the way they are structured, they leave out the linking part, and so do many of the sites online.

1 solution

Just by Googling you can find many step-by-step tutorials on this very topic. See, for instance: Writing Your Own Toy Compiler Using Flex, Bison and LLVM (gnuu.org).
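
To your question 3: the linked tutorial uses Flex and Bison, where the generated yyparse() calls yylex() every time it needs the next token, so you never hand the whole lexed input over at once. The parser pulls tokens one at a time, builds tree nodes as grammar rules match, and can record identifiers in a symbol table as it goes. Below is a minimal hand-rolled sketch in C of that same flow. It is not the tutorial's code and not the only way to do it: the token names, the Node struct, the toy symbol table, and the example input are all invented for illustration.

/* A minimal sketch (not the tutorial's code): a hand-written lexer feeding a
 * recursive-descent parser that builds tree nodes and fills a symbol table. */
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef enum { TOK_NUM, TOK_IDENT, TOK_PLUS, TOK_ASSIGN, TOK_EOF } TokKind;
typedef struct { TokKind kind; char text[32]; } Token;

static const char *src;   /* input being lexed                  */
static Token cur;         /* one-token lookahead for the parser */

/* The lexer: the parser calls this whenever it wants the next token. */
static void next_token(void) {
    while (isspace((unsigned char)*src)) src++;
    if (*src == '\0') { cur.kind = TOK_EOF;    cur.text[0] = 0;        return; }
    if (*src == '+')  { cur.kind = TOK_PLUS;   strcpy(cur.text, "+"); src++; return; }
    if (*src == '=')  { cur.kind = TOK_ASSIGN; strcpy(cur.text, "="); src++; return; }
    size_t n = 0;
    if (isdigit((unsigned char)*src)) {
        cur.kind = TOK_NUM;
        while (isdigit((unsigned char)*src) && n < 31) cur.text[n++] = *src++;
    } else {
        cur.kind = TOK_IDENT;
        while (isalnum((unsigned char)*src) && n < 31) cur.text[n++] = *src++;
    }
    cur.text[n] = 0;
}

/* Parse-tree node: leaves hold a token's text, interior nodes hold an operator. */
typedef struct Node { char label[32]; struct Node *left, *right; } Node;

static Node *new_node(const char *label, Node *l, Node *r) {
    Node *n = malloc(sizeof *n);
    strcpy(n->label, label); n->left = l; n->right = r;
    return n;
}

/* Toy symbol table: identifiers get recorded as the parser sees them. */
static char symtab[16][32];
static int  nsyms;
static void declare(const char *name) {
    for (int i = 0; i < nsyms; i++) if (!strcmp(symtab[i], name)) return;
    strcpy(symtab[nsyms++], name);
}

/* Grammar: stmt -> IDENT '=' expr ;  expr -> NUM ('+' NUM)* */
static Node *parse_expr(void) {
    Node *left = new_node(cur.text, NULL, NULL);   /* expect a number       */
    next_token();
    while (cur.kind == TOK_PLUS) {
        next_token();
        Node *right = new_node(cur.text, NULL, NULL);
        next_token();
        left = new_node("+", left, right);
    }
    return left;
}

static Node *parse_stmt(void) {
    declare(cur.text);                             /* IDENT -> symbol table */
    Node *target = new_node(cur.text, NULL, NULL);
    next_token();                                  /* consume IDENT         */
    next_token();                                  /* consume '='           */
    return new_node("=", target, parse_expr());
}

static void print_tree(Node *n, int depth) {
    if (!n) return;
    printf("%*s%s\n", depth * 2, "", n->label);
    print_tree(n->left, depth + 1);
    print_tree(n->right, depth + 1);
}

int main(void) {
    src = "total = 1 + 2 + 3";
    next_token();                                  /* prime the lookahead   */
    Node *tree = parse_stmt();
    print_tree(tree, 0);
    printf("symbols: ");
    for (int i = 0; i < nsyms; i++) printf("%s ", symtab[i]);
    printf("\n");
    return 0;
}

Running it prints the parse tree for "total = 1 + 2 + 3" and the one symbol it collected. As for linking: that is a much later stage. Once you have generated object code (the tutorial hands code generation to LLVM), the system linker combines it with libraries into an executable, so your lexer and parser normally never touch it.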
 
 
Comments
RavenLee 2-Aug-18 11:54am    
I want to write one myself. I had fun writing the lexer. Thanks, though.
