P4 Loading Large Amounts of Table Entries from Control Plane

I’ve written a functional P4 program that works for small datasets. I’m currently trying to apply a larger dataset with many more table entries. I’m using the P4 tutorial VM and the Mininet Makefile/process for reading the topology json as well as the s1 switch json file. I’m loading the table entries via the s1 switch json file, which is fairly large.

When I run make run, I keep getting the following error:

I’ve fiddled with the json file and noticed that if I delete the problematic entry, the issue persists with the next entry, so my hunch is that this is some timeout issue, or maybe it’s hitting some sort of space limitation? Any help with this would be appreciated. Thank you!

Hi @ThePuriProdigy,

  • Did you configure the size of the table within the P4 program? By default it should be 1024 entries per table.

  • In your json file you can look for the "max_size" field; it tells you the maximum number of entries a table can hold.
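To make the first point concrete, here is a minimal sketch of how a table size is declared in P4_16. The table, key, and action names are hypothetical; the relevant part is the `size` table property, which must be at least as large as the number of entries you plan to install:

```
/* Minimal sketch (hypothetical names): raise "size" above your entry count */
table ipv4_exact {
    key = {
        hdr.ipv4.dstAddr: exact;
    }
    actions = {
        ipv4_forward;
        drop;
        NoAction;
    }
    size = 65536;               // explicit capacity; default is 1024 if omitted
    default_action = NoAction();
}
```

With `size` omitted, inserting the 1025th entry fails, which would explain why deleting one entry just moves the failure to the next one.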

Thanks so much for the quick reply @DavideS

I did not configure the size of the table, as I assumed that by default P4 would automatically scale to the dataset, which I now see might be causing the issue. I do not see a max_size field anywhere. Is it supposed to be there? Thanks!

I do not see a max_size field anywhere. Is it supposed to be there?

Yes, you should have this field in the json file, in other words the output file produced by the compiler. As you can see, the P4 program defines the size of table exact_1
(behavioral-model/exact_match_1.p4 at 27c235944492ef55ba061fcf658b4d8102d53bd8 · p4lang/behavioral-model · GitHub), and this value is mapped into the json file generated by the compiler
(behavioral-model/exact_match_1.json at 27c235944492ef55ba061fcf658b4d8102d53bd8 · p4lang/behavioral-model · GitHub).
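For reference, here is an abridged sketch of what that mapping looks like in the compiler's BMv2 JSON output. The table name is hypothetical and most fields are trimmed; the point is that the `size` property from the P4 source appears as `max_size` in the table object:

```
{
  "tables": [
    {
      "name": "MyIngress.ipv4_exact",
      "match_type": "exact",
      "max_size": 65536,
      "with_counters": false,
      "support_timeout": false
    }
  ]
}
```

If `max_size` in your generated json is smaller than the number of entries in your s1 switch json file, the extra inserts will be rejected.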

Thanks so much for the help!