Risk management in PPC Advertising

In this article I am going to focus on risk management for advertising platforms like Google AdWords or Yahoo! Bing Network Contextual Ads. Here the bidding system is relatively easy to understand: the highest bidder with quality ads is displayed first.

The risk in PPC advertising comes from competitors. If one or more competitors increase the bidding cost for certain keywords / expressions and you want a certain (first) position guaranteed, there is a risk that you will hit the bottom of your budget very fast and end up with an inefficient PPC campaign.

What can be done?

Before answering this question, maybe we should ask ourselves: “Why is our competitor bidding that high?” One answer is that he is irrational and does not really understand the mechanics of PPC; another is that he has a lot of money to spend and wants to eliminate any competition.

So what can you do? You can always work on the quality of your ads, or find new traffic from other keywords. But if this does not work and you have budget limitations, you have to leave your comfort zone and be satisfied with the position below. In PPC advertising, the price for the second position is calculated based on the third position, not on the first one. You have to be patient until your irrational competitor exhausts his budget (obviously, if he engages in irrational bidding, he will soon face the fact that his bidding is not economically viable).

But what about when almost all of your competitors decide to raise their bids on certain keywords / expressions that you once controlled? Are they all chaotic or irrational? Maybe it is a clue that they know something you do not. Maybe their behaviour is not random, and their advantage comes from an angle you have not yet explored.

The standard team structure of an e-commerce web site

Startup structure for an e-commerce business

In a start-up business, the people hired have to be versatile. Each person has to possess a variety of skills and be able to get involved in any part of the business whenever necessary. This is the beauty of an on-line e-commerce start-up: not only can each day offer you a different experience, but you can also improve a lot of skills. The model presented below can apply to many e-commerce start-ups, but it is not necessarily the most efficient structure for every type of on-line shop.

In my opinion, a standard structure can be:

  • 2 persons always on web site management and product uploading (with the skills of a web site administrator: knowledge of HTML, CSS and image manipulation (ideally in Adobe Photoshop), plus some copywriting skills);
  • 1 person in touch with manufacturers and customers’ orders (this person is customer orientated, with excellent communication and organizational skills, initiative, an impeccable work ethic and financial skills);
  • 1 person in on-line sales (with HTML, CSS and on-line marketing skills (SEA, SEM, Social Media, Analytics), and excellent communication skills: sending emails, preparing newsletters and promotions for customer loyalty, and converting abandoned orders);
  • 1 e-commerce manager in charge of the on-line and sales strategy, ideally a person with both technical and on-line marketing skills. Of course, project management and financial skills are necessary as well.

I haven’t mentioned a web site development person, as this role can be outsourced. Initially, an e-commerce platform has to be built or customized, and afterwards different modules will be bespoke developed.

What about you? What are your thoughts?

How to check the integrity of your XML Sitemap

Pages indexed in sitemap in Google Webmaster Tools

For this tutorial you will need Microsoft Excel, Xenu’s Link Sleuth and Notepad.

The search engine will look for valid web pages listed in the XML Sitemap. And by valid pages I mean pages that return a “200 OK” HTTP response. It will not crawl pages with a 4xx or 5xx error (400 Bad Request, 401 Unauthorized, 403 Forbidden, 500 Internal Server Error etc.) or pages with a 3xx redirection (301 Moved Permanently, 302 Found etc.)
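As a quick reference, the status classes above can be expressed in a few lines of Python (a minimal sketch; the class boundaries follow the standard HTTP status code ranges):

```python
def classify_status(code: int) -> str:
    """Classify an HTTP status code the way a sitemap checker would."""
    if code == 200:
        return "valid"        # 200 OK - the only status a sitemap URL should return
    if 300 <= code < 400:
        return "redirection"  # e.g. 301 Moved Permanently, 302 Found
    if 400 <= code < 600:
        return "error"        # 4xx client errors and 5xx server errors
    return "other"            # 1xx informational, 2xx codes other than 200, etc.

print(classify_status(200))  # valid
print(classify_status(301))  # redirection
print(classify_status(404))  # error
```

Only URLs in the first class belong in your sitemap; everything else should be fixed or removed.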

So how can we make sure that our sitemap is good? If you submit a new sitemap through Webmaster Tools, it will take a while until all the links are crawled and checked by the search engine; receiving feedback can take ages. Of course, you will receive an instant answer regarding the integrity of the submitted code, but what about pages with a 404 error, or permanently redirected (301) pages? They are not supposed to be in the sitemap, and I will show you an easy way to detect them using Xenu’s Link Sleuth (a Windows-only freeware application) and Microsoft Excel.

Import your data from Sitemap to Microsoft Excel

1. Open an Excel file
2. Go to the “Data” menu tab (Alt+A)
3. Load your XML “From Web” (presuming that the XML is already online at yourdomain.com/sitemap.xml)
4. Import the data into the worksheet as a table (by default it will be named Table1)
5. Save it as XLS or XLSX
Sitemap imported in Microsoft Excel
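If you prefer scripting to Excel, the same URL extraction can be done with Python’s standard library (a sketch; the namespace below is the standard one from the sitemaps.org protocol, and the sample sitemap is a made-up example):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml: str) -> list:
    """Return the list of <loc> URLs from a sitemap XML document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical inline sitemap; in practice you would read your sitemap.xml file
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://yourdomain.com/</loc></url>
  <url><loc>http://yourdomain.com/about</loc></url>
</urlset>"""

for url in extract_urls(sample):
    print(url)
```

Writing the resulting list to a text file gives you the same input for Xenu’s Link Sleuth as the Excel copy & paste described in the next section.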

Preparing the file to be imported by Xenu’s Link Sleuth

1. Open any text editor (I recommend Notepad++ for many reasons; you can download it from here: Download Notepad++)
2. From the Excel file, select the entire column of URLs and copy & paste it into your text document
3. Save it as a .txt file (e.g.: links-to-be-checked.txt)

Checking the integrity of the URLs

Options in Xenu’s Link Sleuth

1. Open Xenu’s Link Sleuth (you can download the application from: snafu.de or cnet.com)
2. Click on Options > Preferences
3. Modify the default options to match the print-screen. It is very important to set the “Maximum depth” field to 0 and to check “Treat redirections as errors”
4. Load your URL list by going to File > Check URL List (Test)
5. In case you have a very large number of links, be patient. There is a big chance that some URLs will not be checked because of hosting issues, or that the scan will stop in the middle of the process. If that happens, go to File > Retry broken links (CTRL + R).
6. When the link spidering is finished, export your results by accessing File > Export to TAB separated file (E.g.: export-for-excel.txt)

Importing the results in Microsoft Excel

1. On a new worksheet, import the Xenu’s Link Sleuth file by going to the “Data” menu tab (Alt + A) and then clicking the “From Text” button;
2. Follow the Text Import Wizard: check “Delimited – Characters such as commas or tabs separate each field” and “My data has headers”, then press the “Next” button, make sure “Delimiters: Tab” is checked and press Finish.
3. Convert the results into a table by selecting the whole range and pressing “CTRL+L” (by default it will be named Table2)

Synchronising the results using VLOOKUP function in Microsoft Excel

1. Add another column to your first table (E.g.: “Status Code”)
2. On the first cell write the following:

=VLOOKUP(A7, Table2[[Address]:[Status-Code]], 2, FALSE)
A7 – the lookup value
Table2[[Address]:[Status-Code]] – the table imported from Xenu’s Link Sleuth, restricted to its first 2 columns: Address and Status-Code
2 – the column index number; it returns the value from the second column of Table2, named Status-Code
FALSE – the lookup value has to match the search field exactly

3. Apply the formula to all the cells below
4. Finally, you have your results in one table; now you can take measures and remove or fix those links in the XML Sitemap whose status differs from “HTTP 200 OK”. Use Excel’s filter, as in the image, to view the web pages with problems.
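The VLOOKUP step is essentially an exact-match dictionary lookup. For those who script their checks, the same merge can be sketched in Python (the URLs and status codes below are hypothetical, standing in for Xenu’s Address and Status-Code columns):

```python
# Hypothetical exported results: URL -> HTTP status code, as in Xenu's TAB file
xenu_results = {
    "http://yourdomain.com/": 200,
    "http://yourdomain.com/old-page": 301,
    "http://yourdomain.com/missing": 404,
}

# URLs taken from the sitemap
sitemap_urls = [
    "http://yourdomain.com/",
    "http://yourdomain.com/old-page",
    "http://yourdomain.com/missing",
]

# Equivalent of =VLOOKUP(url, Table2, 2, FALSE): exact match, with a fallback
# for URLs the spider never reached; keep only the non-200 pages
problems = [(url, xenu_results.get(url, "not checked"))
            for url in sitemap_urls
            if xenu_results.get(url) != 200]

for url, status in problems:
    print(url, status)  # these are the pages to fix or remove from the sitemap
```

The filter on non-200 statuses plays the role of the Excel filter from the image: it leaves only the sitemap entries that need attention.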

In case you are missing something, you can download the final Excel file, with the sitemap already checked, from here: how-to-check-the-integrity-of-your-xml-sitemap.xlsx