How to use GSA SER to build links to your site

In the previous article I highly recommended using GSA products to build links to your site; now I'll explain how to actually use GSA Search Engine Ranker (SER).

Please note that GSA SER is a little complicated and has lots of fine-grained options, most of which I won't cover; I'll just point out several important things I came across while using it.

Work in Tiers

You don't want to point bad links at your site, so work in tiers. Your first tier holds your best links: contextual, do-follow links from the full articles you submit to article sites, profiles, wikis, and so on. These point directly at your site.

Your second tier holds less important links that are still good enough (profile links, pages dedicated to your link); these point at your first-tier links.

Your third and last tier replaces an indexer: it points fast-to-gather but low-quality links at your second-tier links. Select all the engines that don't need email verification here, as they are much faster.

If you want to use an indexer, point only the final tier at it; the rest of the tiers need no indexer, since indexed links already point at them.

The difference between the tiers is in the engine types you select.
Creating the next tier for a project takes a single click in GSA SER (Modify project -> Duplicate -> Tier project).
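
To make the tier structure concrete, here is a minimal sketch of who points to whom (plain Python; the names and labels are mine, not GSA SER settings):

```python
# Illustrative model of the three-tier setup described above.
# Every label here is my own; nothing maps to an actual SER option.
tiers = {
    "money_site": {"points_to": None},  # your actual site
    "tier1": {  # best contextual, do-follow article links
        "points_to": "money_site",
    },
    "tier2": {  # profile links, dedicated link pages
        "points_to": "tier1",
    },
    "tier3": {  # fast no-email-verification engines; replaces an indexer
        "points_to": "tier2",
    },
}

# Walk the chain from the bottom tier up to the money site.
node = "tier3"
while tiers[node]["points_to"] is not None:
    target = tiers[node]["points_to"]
    print(f"{node} links point at {target}")
    node = target
```

The one rule worth remembering: each tier only ever points at the tier directly above it; only tier 1 touches your site.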

I suggest dripping links to your site rather than blasting millions of links per day. The drip should look something like this:
Tier 1: 8 do-follow links/day (articles)
Tier 2: 8 do-follow links/day (profiles)
Tier 3: 20 links/day (fast, no-email-needed, do-follow links)
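
At these rates, a quick tally shows how gentle the drip really is compared to "millions of links per day" blasts. A minimal sketch, using only the numbers above:

```python
# Daily drip rates from the list above (links/day per tier).
rates = {"tier1": 8, "tier2": 8, "tier3": 20}

for days in (7, 30):
    totals = {tier: rate * days for tier, rate in rates.items()}
    print(f"after {days} days: {totals}, total: {sum(totals.values())}")
```

After a full month you are still at roughly a thousand links in total, most of them in the disposable third tier.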

Some other small project tips


  • To drip a fixed number of links per day, use the "Pause project...VERIFICATIONS ... per URL" feature.
  • I suggest a smaller re-verify interval the more important the tier: for tier 1 I would suggest re-verifying the links every 720 minutes, for tier 2 every 1200 minutes, and for the final (indexing) tier every 6000 minutes, as there is no real need to know whether the final tier's links are really there. Re-verification matters because it stops you from building links from other tiers to non-existent links, which also removes a lot of load from your SER machine.
  • Another biggie: don't build links to your tiered links until at least 2 days have passed; this way, if a link disappears within the first day, you won't have built links to it. Set this in the Project -> Data tab under "Tier Filter Options" -> "Check URLs age in days" -> enter a value of 2-7, which means any new link of yours gets juice from the higher tiers from day 2 to day 7, after which SER sends no more links to it (you need SER's power elsewhere). A sketch of this age-window check follows the list.
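
For illustration, here is a minimal sketch of that 2-7 day age window, assuming you track the date each link was verified. The function and data are hypothetical; this is not SER's internal code:

```python
from datetime import date, timedelta

# Only build links to an upper-tier URL if it is at least 2 days old
# (so re-verification has had a chance to weed out dead links) and at
# most 7 days old (after that, SER's power is better spent elsewhere).
MIN_AGE_DAYS, MAX_AGE_DAYS = 2, 7

def eligible_targets(verified_links, today=None):
    """verified_links: dict mapping url -> date the link was verified."""
    today = today or date.today()
    return [
        url for url, verified_on in verified_links.items()
        if MIN_AGE_DAYS <= (today - verified_on).days <= MAX_AGE_DAYS
    ]

# Hypothetical example data:
links = {
    "http://example.org/a": date.today() - timedelta(days=1),   # too new
    "http://example.org/b": date.today() - timedelta(days=4),   # in window
    "http://example.org/c": date.today() - timedelta(days=12),  # too old
}
print(eligible_targets(links))  # -> ['http://example.org/b']
```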


Indexing

If for some reason you decide to work with an indexer, I highly suggest the GSA SEO Indexer; it is very stable and very good.
  • In GSA SEO Indexer there is no need to use proxies
  • Use full indexer
  • Use deep indexing, which means the indexer only uses sites that can index your specific URL, rather than just the whole domain (indexing the domain alone is useless)
  • Don't filter by PR!

PageRank

Disable all functions related to PageRank: it is outdated, nobody really uses it anymore, and it costs you dearly in proxies and network time.


Proxies

You can use public proxies; there is no real need for private ones, unless you are a serious professional with hundreds of projects or customers and know exactly what you are doing, in which case you know more about SER than I do anyway.

SER has a very powerful engine that scrapes regularly updated online lists and gets you enough proxies for your tasks.

Just note that proxies are much slower than a direct connection, so pay special attention to which tasks you run through proxies: check only the tasks you know need proxies and leave all the rest unchecked.
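
If you want to see the slowdown for yourself, it is easy to script a comparison. This is a minimal sketch using the Python requests library; the proxy address is a placeholder you would replace with one of the public proxies SER scraped:

```python
import time
import requests

TEST_URL = "https://www.example.com"
# Placeholder address (documentation range) -- substitute a real proxy.
PROXY = {"http": "http://203.0.113.10:8080",
         "https": "http://203.0.113.10:8080"}

def timed_get(proxies=None):
    """Return the request duration in seconds, or None on failure."""
    start = time.monotonic()
    try:
        requests.get(TEST_URL, proxies=proxies, timeout=15)
        return time.monotonic() - start
    except requests.RequestException:
        return None  # dead or overloaded proxy

print("direct:   ", timed_get())
print("via proxy:", timed_get(PROXY))
```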

Disable your antivirus

The viruses on the scraped sites can't infect you, but your antivirus will scan them all and drive you crazy with alerts.

If you are not sure, work inside a virtual machine.


Groups

In GSA SER, group your projects into dedicated, logical groups; this removes much of the mess and makes your work far more pleasant.


Templates

There is no real need for a project creator. Do your research, learn GSA SER, and create the perfect project; it will take time. Then click Tools and export the project as a template. From then on, always create new projects from that template: all of your options will be saved in the template, and the only things you will need to change are the articles, the keywords, and the main URL(s).

Weekly maintenance

I recommend doing maintenance once in a while (every week or two?): check all created links to see whether they still exist and are indexed. For this task I recommend ScrapeBox ($70); it will help you with all of your maintenance tasks.
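
If you want a quick do-it-yourself version of the "do my links still exist" half of that check, a short script covers it (checking whether links are indexed is where ScrapeBox earns its money). A minimal sketch, assuming you have exported your verified URLs to a text file, one per line; the file name is my own choice:

```python
import requests

def check_links(path="verified_urls.txt"):
    """Return (url, reason) pairs for links that no longer resolve."""
    dead = []
    with open(path) as f:
        for url in (line.strip() for line in f if line.strip()):
            try:
                r = requests.get(url, timeout=15, allow_redirects=True)
                if r.status_code >= 400:
                    dead.append((url, str(r.status_code)))
            except requests.RequestException as exc:
                dead.append((url, type(exc).__name__))
    return dead

for url, reason in check_links():
    print(f"DEAD: {url} ({reason})")
```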

Back up your options, templates, and projects.
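
A dated zip archive once a week is enough for that. A minimal sketch follows; note that the data path is my assumption about where GSA SER keeps its files (under %APPDATA%), so verify it on your machine before relying on this:

```python
import os
import shutil
from datetime import date

# Assumed data location -- check where your SER install keeps its data.
SER_DATA = os.path.join(os.environ["APPDATA"], "GSA Search Engine Ranker")
BACKUP_DIR = r"C:\ser_backups"  # anywhere outside the SER folder

os.makedirs(BACKUP_DIR, exist_ok=True)
archive = os.path.join(BACKUP_DIR, f"ser_backup_{date.today():%Y%m%d}")
shutil.make_archive(archive, "zip", SER_DATA)
print(f"Backed up to {archive}.zip")
```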

Look at this article for more maintenance ideas:
https://asiavirtualsolutions.com/maintanance-for-gsa-search-engine-ranker/
