Global attention can mean many things. A few examples are:
concatenate the global input features to the graph embedding produced by the message-passing layers.
a slight generalization of the above: pass the global input features through an MLP first, then concatenate its output to the graph embedding generated by the MPNN layers.
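Both variants can be sketched in a single readout module. This is a minimal illustration in plain PyTorch, not a reference implementation: the MPNN layers producing the node embeddings are assumed to exist elsewhere, and the module name, dimensions, and mean-pooling choice are all hypothetical.

```python
import torch
import torch.nn as nn

class GlobalFeatureReadout(nn.Module):
    """Combine a pooled graph embedding with global input features.

    use_mlp=False gives plain concatenation (first variant);
    use_mlp=True passes the global features through an MLP first
    (second variant).
    """

    def __init__(self, global_dim, hidden_dim, use_mlp=True):
        super().__init__()
        self.global_mlp = (
            nn.Sequential(nn.Linear(global_dim, hidden_dim), nn.ReLU())
            if use_mlp
            else nn.Identity()
        )

    def forward(self, node_embeddings, global_features):
        # Mean-pool node embeddings into a single graph embedding.
        graph_embedding = node_embeddings.mean(dim=0)
        g = self.global_mlp(global_features)
        # Concatenate the (optionally transformed) global features.
        return torch.cat([graph_embedding, g], dim=-1)

readout = GlobalFeatureReadout(global_dim=4, hidden_dim=8)
nodes = torch.randn(10, 16)  # 10 nodes, 16-dim embeddings from an MPNN
globals_ = torch.randn(4)    # 4 global input features
out = readout(nodes, globals_)
print(out.shape)             # 16 (pooled) + 8 (MLP output) = 24
```

The concatenated vector would then feed into whatever prediction head follows the graph embedding.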
Resources to consider