jeremiah-c-leary / vhdl-style-guide

Style guide enforcement for VHDL

Local Rule Creation Documentation #1174

Open Benito-E opened 1 month ago

Benito-E commented 1 month ago

Hello!

I've been working for a little while on trying to create some local rules for vsg, but when I look through both the stable and latest docs for vsg, the page for localization is out of date... I was wondering if anyone would be able to give me a rundown on how local rule creation has changed since the docs for it were made.

There are a number of rules I'd like to try and create, but for starters I want to see if I can make a rule that enforces the existence of comments at the beginning of the file, for the sake of self-documentation. Past that, if I could make vsg enforce specific templates or specific multi-line comment strings, that would be even better. But first I'd like to know how to create a simple rule, perhaps even by recreating one that already exists in vsg.

Thank you!

Benito-E commented 1 month ago

Some additional questions based on my further research, useful as notes to myself:

Is the parser.beginning_of_file class type a supported token? I only see it used in a few locations, and none of those locations are rules... When I try to make a child rule of the existence_of_tokens_which_should_not_occur rule and give it parser.beginning_of_file (just to check whether vsg's tokenization appends the bof), the rule doesn't cause any errors, but it also doesn't do anything. I would guess this is because, assuming vsg doesn't tokenize the bof, the bof token will never be in the list of tokens and therefore could never violate the existence rule.
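
For reference, my attempt looked roughly like this (abbreviated; the constructor arguments are whatever the parent rule actually expects, so don't take this literally):

from vsg import parser
from vsg.rules import existence_of_tokens_which_should_not_occur

lTokens = [parser.beginning_of_file]


class rule_001(existence_of_tokens_which_should_not_occur):
    def __init__(self):
        existence_of_tokens_which_should_not_occur.__init__(self, lTokens)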

jeremiah-c-leary commented 1 month ago

Evening @Benito-E ,

I've been working for a little while on trying to create some local rules for vsg, but when I look through both the stable and latest docs for vsg, the page for localization is out of date... I was wondering if anyone would be able to give me a rundown on how local rule creation has changed since the docs for it were made.

The process is still the same. You can find an example under tests/vsg/local_rules. As for creating rules check out this section of the documentation.

There are a number of rules I'd like to try and create

Would you be willing to elaborate? These rules may be worth adding to VSG for others to use.

but for starters I want to see if I can make a rule that enforces the existence of comments at the beginning of the file, for the sake of self-documentation.

Would this be something like checking for a header?

Past that, if I could make vsg enforce specific templates or specific multi-line comment strings, that would be even better.

You should check out the base_comment rules for a template. They check for specific formatting of comments.

But first I'd like to know how to create a simple rule, perhaps even by recreating one that already exists in vsg.

That is not a bad idea. I would create a directory and add the __init__.py file. Then copy one of the existing rules over and use the -lr option. That would let you know the local rules are working. Then you can start changing that rule or adding another one.
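
For example, the layout could be as simple as this (directory and file names here are just placeholders):

local_rules/
    __init__.py
    rule_001.py    <-- copy of an existing rule to start from

and then pointing VSG at it with something along the lines of:

vsg -f my_design.vhd -lr local_rules

If the copied rule starts reporting violations on my_design.vhd, you know the local rules are being picked up.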

Is the parser.beginning_of_file class type a supported token?

I do not believe that token is used. It must have been left over from a previous attempt at something. I think I was trying to solve a problem, but ended up using something else.

If you have any other questions or would like to collaborate just leave me a message.

Regards,

--Jeremy

Benito-E commented 1 month ago

Hi @jeremiah-c-leary !

Since yesterday I've made a lot of progress with rule creation, in part due to your explanations! I've created two rules: one that explicitly prohibits the use of the buffer keyword. That rule is a child rule of the existence_of_tokens_which_should_not_occur parent rule, and is a very simple rule due to the parent rule's already-present functionality.

The other rule I created was, as you put it above, a rule that checks for a header. The particular header text it checks for is entered manually into the rule's configuration via a config file, and the rule checks the first n lines (as strings) of the file being tested to make sure they match. If they don't match, I give the violation.New() constructor a line number of 1, the class of the first detected token, and my solution statement. It essentially looks like this:

localConfig.json:

{
    "rule" : {
        "local_001" : {
            "disable" : false,
            "header" : [
                "--------------------------------------------------------------------------------",
                "-- This is a header that must appear at the beginning of every file",
                "--------------------------------------------------------------------------------"
            ]
        }
    }
}

rule001.py (pseudocode):

from vsg import rule
from vsg import violation


class rule_001(rule.Rule):
    def __init__(self):
        self.header = []
        self.configuration.append("header")
        self.otherSelfStuff = ...                        # name, identifier, solution string, etc.

    def _get_tokens_of_interest(self, oFile):
        # the first token in the file is appended so it can be attached to a violation
        return oFile.get_lines() + [<the first token in the file>]

    def _analyze(self, lToi):
        for i, sLine in enumerate(self.header):
            if sLine != lToi[i + 1]:                     # +1 because get_lines() adds an empty string at index 0
                self.add_violation(violation.New(1, lToi[-1], self.solution))
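
With both files in place, I run vsg on a file with something like the following (exact option spellings per the installed version's --help):

vsg -f my_design.vhd -c localConfig.json -lr <my local rules directory>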

Since the header to be checked is in the config file, it can be easily modified by users for their own particular needs. This could probably turn into a much larger rule if it allowed header templates of some kind, with unchecked strings in between for things like a name or a date. Regardless, what I have is something like what I put above, and it seems to work well.
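
A minimal sketch of how that template idea might work, assuming a wildcard marker such as "<any>" in the configured header (the marker is just something I made up, not anything VSG defines):

    def _analyze(self, lToi):
        for i, sLine in enumerate(self.header):
            if sLine == "<any>":
                continue                                 # template line; accept whatever the file has here
            if sLine != lToi[i + 1]:                     # +1 because get_lines() adds an empty string at index 0
                self.add_violation(violation.New(1, lToi[-1], self.solution))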

I do not believe that token is used

Before I figured things out, I'd thought I could make a rule that simply checks whether a custom token (a token defined by any user-defined string) appeared after the parser.beginning_of_file token. What I have now seems to work well, so I don't necessarily need this anymore, but I wanted to elaborate on my thought process.

Would you be willing to elaborate? These rules may be worth adding to VSG for others to use.

I think I got ahead of myself when I said I had 'a number' of rules; the rules I've made above were the main ones I wanted to create. The only other idea I have would be to create a set of rules whose purpose isn't to enforce one particular thing, but rather to be flexibly configurable by users to enforce a large number of things. For example, the idea I had above that included a user-defined token, whose existence or location you could then check. I can't pretend to imagine how a rule like that would be implemented, though. In any case, thank you for the assistance, and I hope my pseudocode and silly ideas are of use to you. Cheers!

--Benito

jeremiah-c-leary commented 1 month ago

Afternoon @Benito-E ,

Since yesterday I've made a lot of progress with rule creation, in part due to your explanations!

I'm glad you were able to add your rules.

I've created two rules: one that explicitly prohibits the use of the buffer keyword. That rule is a child rule of the existence_of_tokens_which_should_not_occur parent rule, and is a very simple rule due to the parent rule's already-present functionality.

That is a good rule. I should have thought of that one.

There are quite a few base rules that will do most of what you would want to check. That makes adding new rules fairly easy.

The other rule I created was, as you put it above, a rule that checks for a header.

I like that you made the header configurable.

It seems that you have a good grasp of how to create rules. Is there anything that could have been better documented to make it easier?

--Jeremy

Benito-E commented 4 weeks ago

@jeremiah-c-leary Apologies for the late reply,

I'd say the only thing that really needs updating is that This Page is out of date.

From my general understanding of that page, much of what's documented there is still the same, except for the sorts of functions that are required for the rule to be called and run. If I'm not mistaken, in the current version of VSG, local rules can be either:

  1. Extended from a base rule or
  2. Built as a unique rule

And if the latter option is chosen, the required functions are either simply:

  1. An overloaded analyze(self, oFile): method (in which case you may need to consider some other things like code tags) or
  2. The _analyze(self, lToi): and _get_tokens_of_interest(self, oFile): methods (roughly sketched below)
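
The skeleton I've been basing my stand-alone rule on looks roughly like this (method bodies elided, so treat it as pseudocode):

from vsg import rule


class rule_001(rule.Rule):
    def _get_tokens_of_interest(self, oFile):
        # gather the lines or tokens the rule needs to look at
        return oFile.get_lines()

    def _analyze(self, lToi):
        # walk the tokens of interest and call self.add_violation() for anything that breaks the rule
        ...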

My understanding could be somewhat incorrect, but it has served me well thus far. If it is correct, then the documentation page, which mentions other deprecated methods such as _pre_analyze(), simply needs to be updated.

--Benito