9ci / 9ci.github.io

http://9ci.github.io
GNU General Public License v3.0

Technical SEO Issues #17

Closed by jdabal 4 years ago

sushantkumbhar9 commented 4 years ago

9ci.com - Website Health Audit https://docs.google.com/spreadsheets/d/1JuY0BRSwYG-J0v2X1jzx5UaIbjq3T7RQMgnHQTSINS8/edit#gid=1305372726

sushantkumbhar9 commented 4 years ago

Hi @jdabal

Robots.txt

We need to disallow our directory index ("Index of") pages, because they are showing up in Google search results.

https://www.9ci.com/robots.txt

Check the attached screenshot.

Sitemap.xml

Are we actually offering the services shown in the screenshot below?

(screenshot attached)

Page Speed

According to GTmetrix, our page speed score is 53%: https://gtmetrix.com/reports/www.9ci.com/44gLk5sm

Check the attached screenshot.

We need to work on these items to fix the page speed score (see attached screenshot).

WhitehatSEO-Expert commented 4 years ago

9ci.com - On-Page L2 Recommendation https://docs.google.com/document/d/1BCb8-IWJkN-tantowCn2ZMO3Ra0jAE--_0huSwcp0Ao/edit

jdabal commented 4 years ago

@sushantkumbhar9 The On-Page L2 Recommendation document is good, but it is a little out of date:

  1. https://www.9ci.com/robots.txt is not just sitemap.xml. At one point you said that we should add the sitemap to robots.txt; should we still do it?

  2. Structured Data Schema Markup: what are your suggestions? I thought we had done them all. What else has to be done?

  3. Browser Compatibility Test: I thought we discussed this and it was 100% good, but now I see issues using your test tool. Should we fix them? How critical is it?

WhitehatSEO-Expert commented 4 years ago

Yes, this is an old report. I was migrating all my previous work to GitHub, which is why you are seeing an out-of-date report here.

Yes, we still need to add the sitemap URL to the robots.txt file. This will help Google discover and index changes to our website more reliably.
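A minimal sketch of what the robots.txt could look like with the sitemap reference added (assuming the sitemap lives at the site root; adjust the path if ours is different):

```
User-agent: *
Allow: /

Sitemap: https://www.9ci.com/sitemap.xml
```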

You have added structured data, but only on the home page. We need schema on all the pages. I have one relatively effective solution for this: since we have GTM now, I can create triggers for the structured data in GTM, and it will apply to the whole site quickly, with no further changes needed on your side.
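As a rough sketch of the GTM approach, a hypothetical Custom HTML tag fired on the built-in All Pages trigger could inject sitewide Organization markup; the logo path and social profile URLs below are placeholders, not our actual values:

```html
<!-- GTM Custom HTML tag, fired on the "All Pages" trigger -->
<script>
  // Hypothetical sitewide Organization schema injected at page load.
  var orgSchema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "9ci",
    "url": "https://www.9ci.com",
    "logo": "https://www.9ci.com/images/logo.png", // placeholder path
    "sameAs": [
      "https://twitter.com/9ci",               // placeholder social profiles
      "https://www.linkedin.com/company/9ci"
    ]
  };
  var tag = document.createElement('script');
  tag.type = 'application/ld+json';
  tag.text = JSON.stringify(orgSchema);
  document.head.appendChild(tag);
</script>
```

Because the tag fires on every page view, the same Organization and logo markup would appear sitewide without touching the page templates.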

On browser compatibility, we do still have some issues: our website is not fully optimized for different browsers. This is not a critical issue; we can work on it later, once we have handled the other priority tasks.

jdabal commented 4 years ago

What about your comment from above: "Need to disallow our index pages, because they are showing up in Google search results."

What do I have to do? I don't understand.

jdabal commented 4 years ago

@sushantkumbhar9

  1. Give me an example of the structured data I should add, for example to the solutions page https://www.9ci.com/solutions/

  2. Robots.txt has sitemap.xml now

  3. What does "Need to disallow our index pages" mean?

WhitehatSEO-Expert commented 4 years ago

We need to stop Google from indexing these types of pages: http://repo.9ci.com/oss-snapshots/org/grails/plugins/gorm-tools/ http://repo.9ci.com/oss-snapshots/org/grails/

We can do this either via Google Search Console or by putting a robots.txt on repo.9ci.com.
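For the robots.txt route, a sketch of what could be served at http://repo.9ci.com/robots.txt (an assumption, not something already in place; it blocks all crawling of the repo host):

```
User-agent: *
Disallow: /
```

Note that robots.txt only stops future crawling; URLs that are already indexed may still need a removal request in Google Search Console.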

WhitehatSEO-Expert commented 4 years ago
  1. The Solutions page wouldn't be the right example. Let me give a better one: if I pick a blog post like https://www.9ci.com/articles/collection-metrics-cei-dso-collection-efficiency-index, I will add an Article schema to that page.

And we will add logo and social media schema sitewide so it shows up on all pages.

  2. Thanks for the update.

  3. Already replied.
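For illustration, the Article schema for that blog post might look like the JSON-LD below; the headline is guessed from the URL slug, and the date and logo path are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Collection Metrics: CEI, DSO, Collection Efficiency Index",
  "url": "https://www.9ci.com/articles/collection-metrics-cei-dso-collection-efficiency-index",
  "author": { "@type": "Organization", "name": "9ci" },
  "publisher": {
    "@type": "Organization",
    "name": "9ci",
    "logo": { "@type": "ImageObject", "url": "https://www.9ci.com/images/logo.png" }
  },
  "datePublished": "2020-01-01"
}
```

This would go in a `<script type="application/ld+json">` block on the article page, or be injected via GTM as discussed earlier.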

jdabal commented 4 years ago

Created separate issues: https://github.com/9ci/9ci.github.io/issues/35 and https://github.com/9ci/9ci.github.io/issues/36