A word embedding is a vector encoding the contexts a word appears in.
A lexical embedding is a vector encoding the contexts a lexical pattern appears in.
For each l-gram in the text whose length matches the lexical pattern, record a 0 or 1 indicating whether the pattern matches at that position. Summing these indicators for each pattern across the text yields real-valued embeddings. The result is lexical features instead of word features.
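A minimal sketch of the indicator-then-sum construction described above. The pattern representation is an assumption (the note does not define one): here a pattern is a token tuple where `"*"` is a wildcard, and `match`, `pattern_indicators`, and `lexical_embedding` are hypothetical helper names.

```python
def match(pattern, lgram):
    # "*" in the pattern is a wildcard; any other element must match exactly.
    return all(p == "*" or p == w for p, w in zip(pattern, lgram))

def pattern_indicators(tokens, pattern):
    # One 0/1 indicator per l-gram position in the text.
    l = len(pattern)
    return [1 if match(pattern, tokens[i:i + l]) else 0
            for i in range(len(tokens) - l + 1)]

def lexical_embedding(tokens, patterns):
    # Sum each pattern's indicators across the text:
    # one real-valued feature per lexical pattern.
    return [sum(pattern_indicators(tokens, p)) for p in patterns]

tokens = "the king of spain and the queen of england".split()
patterns = [("the", "*", "of"), ("*", "and", "*")]
print(lexical_embedding(tokens, patterns))  # → [2, 1]
```

The pattern `("the", "*", "of")` fires at "the king of" and "the queen of", giving the feature value 2; counting matches rather than stopping at the first occurrence is what makes the embedding real-valued rather than binary.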