OpenMP version of code does not work across nodes
When the combination of MPI tasks and cores per task exceeds a single node, MPI does not pass messages correctly.
Running with 5 tasks and 5 cores per task produces:
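For reference, a job layout that can reproduce this kind of failure might look like the sketch below. The scheduler flags and the executable name are assumptions for illustration, not taken from the report; the point is that 5 tasks x 5 threads = 25 cores, which spills the job onto a second node whenever a node has fewer than 25 cores.

```shell
#!/bin/bash
# Hypothetical Slurm script (flags and executable name are placeholders).
#SBATCH --ntasks=5            # 5 MPI tasks
#SBATCH --cpus-per-task=5     # 5 cores per task -> 25 cores total
export OMP_NUM_THREADS=5      # match --cpus-per-task
srun ./executable             # placeholder for the actual binary
```

With a typical 24-core (or smaller) node, this layout forces at least one task onto a second node, so the hybrid MPI+OpenMP message exchange crosses the node boundary.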
...
Initializing Grids on level 3
level -2
level -1
Error - trying to unpack additional message blocks without posting receives
If message is not a multi-block message - this could be due to a pre-calculation error
message%lMultiBlock= F 14 0 0